JAB_Creations - 4:28 am on Sep 5, 2011 (gmt 0)
Do your robots.txt files rely on a database for any reason? If so, do you log MySQL errors?
If your robots.txt file depends on a service that temporarily becomes unavailable, do you know how your site will react? For example, by using the base element my entire site works seamlessly both locally and live with the exact same code. Since my site is database driven, I have tested it by disabling the database and then loading pages: the site is aware when the database is unavailable for a request and falls back into a "safe mode" of sorts.
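The fallback described above can be sketched roughly like this (a minimal, hypothetical example; the function names and fallback content are mine, not from the original site):

```python
def fetch_from_db() -> str:
    """Stand-in for a real database query; here it always fails,
    simulating the database being unavailable for this request."""
    raise ConnectionError("database unreachable")

def get_page_content(fetch, static_fallback: str) -> str:
    """Try the database first; if it is unreachable, serve static
    fallback content ("safe mode") instead of failing the request."""
    try:
        return fetch()
    except ConnectionError:
        return static_fallback

# With the database down, the request still succeeds in safe mode.
print(get_page_content(fetch_from_db, "<p>Safe mode: limited content.</p>"))
```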
If that is the situation you're in, I would recommend serving an HTTP 503 header rather than simply guessing at what you're seeing; the presumption is that Google will wait a few moments or minutes and then try again.