Keeping in mind that I only speak about three words of PHP...
What's the most painless way to log headers on a robots.txt request? Currently I've got a logheaders.php that's part of the general footer, so it's invoked whenever the server builds a page (including, by design, the 403 page, so headers are logged on all blocked requests, even for non-pages I normally can't be bothered with). But obviously you can't include() anything in a plain-text file.
I thought of rewriting robots.txt to a "robots.php" consisting of two things: an include of logheaders.php, followed by the contents of the requested robots.txt. But then it all turns into run-in HTML on screen and no longer looks like a plain-text file to the naked eye. It does still look like plain text in the HTML source, so is this actually a non-problem as far as robots are concerned?
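For what it's worth, here is a minimal sketch of what that robots.php could look like. The log path and log format are made-up assumptions (I don't know what the real logheaders.php does), but the key point is real: PHP sends a Content-Type: text/html header by default, which is why the output renders as run-together HTML; overriding it with text/plain makes the response look like an ordinary robots.txt to both browsers and robots.

```php
<?php
// Sketch of a robots.php stand-in for a static robots.txt.
// Assumptions: the log filename and format below are invented,
// and the real logheaders.php presumably does something similar.

// Turn a header array (name => value) into one log line.
function format_header_log(array $headers): string
{
    $parts = [];
    foreach ($headers as $name => $value) {
        $parts[] = $name . ': ' . $value;
    }
    return date('c') . '  ' . implode(' | ', $parts) . "\n";
}

// PHP defaults to Content-Type: text/html; this override is what
// makes the output read as plain text instead of run-in HTML.
header('Content-Type: text/plain; charset=utf-8');

// getallheaders() exists under Apache (and PHP's built-in server)
// but not under plain CLI, hence the guard.
if (function_exists('getallheaders')) {
    file_put_contents(
        __DIR__ . '/robots-headers.log',   // hypothetical log path
        format_header_log(getallheaders()),
        FILE_APPEND | LOCK_EX
    );
}

// Serve the real robots.txt contents unchanged.
if (is_readable(__DIR__ . '/robots.txt')) {
    readfile(__DIR__ . '/robots.txt');
}
```

On Apache, something like `RewriteRule ^robots\.txt$ robots.php [L]` would route requests for robots.txt to this script, so the URL the robots fetch never changes; the exact directive depends on your server setup.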
Or am I on the wrong track entirely? (Probably...)