Yep, it's 488.
199.16.156.126 - - [27/Feb/2015:18:37:26 -0600] "GET /robots.txt HTTP/1.1" 200 488 "-" "Twitterbot/1.0"
Now, as it turns out, one of my sites has DENY FROM 199.16.156.26 in its .htaccess, but that shouldn't prevent Twitterbot from fetching my root robots.txt file.
Also, for example:
66.249.75.237 - - [27/Feb/2015:09:17:46 -0600] "GET /robots.txt HTTP/1.1" 200 488 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
Now Google is never denied access at all (as far as I can tell).
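For what it's worth, lines like these can be split into named fields with a short script. This is just a sketch; it assumes the standard Apache combined log format rather than anything specific to my server's config:

```python
import re

# Apache combined log format (an assumption, not from this server's config):
#   host ident user [time] "request" status bytes "referer" "user-agent"
LOG_RE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<bytes>\S+) '
    r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

line = ('199.16.156.126 - - [27/Feb/2015:18:37:26 -0600] '
        '"GET /robots.txt HTTP/1.1" 200 488 "-" "Twitterbot/1.0"')
m = LOG_RE.match(line)
print(m.group('status'), m.group('bytes'))  # -> 200 488
```

In this layout the three-digit field after the quoted request is the HTTP status, and the field after that is the response size in bytes.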
Whoa. Now that I look at it more carefully, *all* requests for my robots.txt file are now getting 488'ed. Huh?
Could this be an error in my robots.txt file? I don't think I've made any big changes there, and it's a pretty short file. It is owned by Admin, rather than root. Is that right? Would that make a difference?
Now here's something weirder. Looking back through my logs, a week ago they were also all getting 488s. But 2, 3, 4, and 5 weeks ago they were getting 455s, and 6 weeks ago they were getting 425s. I see requests for robots.txt in my logs all the time, but I've never noticed the odd status code on them. Sheesh.
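To see at a glance what robots.txt requests have been getting over time, one could tally the status/size pairs across the log. A sketch, again assuming the combined log format; the two sample lines here are the ones quoted above, where in practice you'd iterate over the real access log file:

```python
import re
from collections import Counter

# Two sample lines from the logs quoted above; in practice, read the real log.
SAMPLE = '''\
199.16.156.126 - - [27/Feb/2015:18:37:26 -0600] "GET /robots.txt HTTP/1.1" 200 488 "-" "Twitterbot/1.0"
66.249.75.237 - - [27/Feb/2015:09:17:46 -0600] "GET /robots.txt HTTP/1.1" 200 488 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
'''

# Pull the request path, status, and size out of a combined-format line.
LOG_RE = re.compile(r'"\S+ (?P<path>\S+)[^"]*" (?P<status>\d{3}) (?P<size>\S+)')

counts = Counter()
for line in SAMPLE.splitlines():
    m = LOG_RE.search(line)
    if m and m.group('path') == '/robots.txt':
        counts[(m.group('status'), m.group('size'))] += 1
print(counts)  # -> Counter({('200', '488'): 2})
```

Running this over a few weeks of logs would show whether the (status, size) pair actually changed week to week or only the size field did.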