I have a robots.txt file for a domain. It contains 230 Disallow statements, all of them syntactically valid. Googlebot reads the file routinely, and WMT reports that it is processed "successfully".
My problem is that a URL which was disallowed in an old version of robots.txt several months back is still being blocked, when in fact I no longer want it blocked.
For whatever reason, Google seems to be using that old version of robots.txt, despite the fact that I've made many changes to the file over the last month and Google has re-crawled it.
Is there a standard period of time that typically needs to elapse before a new version of robots.txt becomes the de facto standard for the site? Is there something I can do to force Google to use the new version?
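For anyone who wants to sanity-check which rules in the current file actually block a given URL, Python's standard-library `urllib.robotparser` can parse the file locally and answer per-URL (the paths and domain below are made-up examples, not my real entries):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical excerpt of a robots.txt file (paths are illustrative only)
robots_lines = """
User-agent: *
Disallow: /private/
Disallow: /tmp/
""".strip().splitlines()

parser = RobotFileParser()
parser.parse(robots_lines)

# Ask whether Googlebot may fetch specific URLs under these rules
print(parser.can_fetch("Googlebot", "http://example.com/private/page.html"))  # False
print(parser.can_fetch("Googlebot", "http://example.com/public/page.html"))   # True
```

This only tells you what the file on the server says right now, of course; it doesn't show which cached copy Google is actually honoring.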
Thanks in advance!