Msg#: 3334730 posted 11:20 am on May 9, 2007 (gmt 0)
I'm doing some consultancy at a small web design firm and we've just spotted a major flaw: there was a "disallow all spiders" robots.txt in the root of their shared CMS directory, effectively banning search engines from every site.
We've now removed it; does anyone know of a way to 'nudge' Google or Yahoo into re-checking the robots.txt so we can then submit a sitemap and try to get these sites 'visible' to the engines again?
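For anyone in the same boat who wants to confirm the fix before waiting on the crawlers, Python's standard-library urllib.robotparser can check a set of rules locally. This is just a sketch with made-up rule strings (the "Disallow: /" line is the kind of rule that was doing the damage):

```python
from urllib.robotparser import RobotFileParser

# The kind of catch-all rule that blocks every crawler from the whole site:
blocking = RobotFileParser()
blocking.parse("User-agent: *\nDisallow: /".splitlines())

# After the fix: an empty Disallow value permits crawling of everything.
fixed = RobotFileParser()
fixed.parse("User-agent: *\nDisallow:".splitlines())

print(blocking.can_fetch("Googlebot", "http://example.com/"))  # False
print(fixed.can_fetch("Googlebot", "http://example.com/"))     # True
```

You can also point RobotFileParser at the live file with set_url() and read() to test exactly what the engines will see.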
Msg#: 3334730 posted 12:39 pm on May 9, 2007 (gmt 0)
The search engines request robots.txt whenever they visit a site. The best way to get them to visit yours is to get more links pointing to it.
On a very well linked site that had the same problem as yours, Google took less than 24 hours to start indexing it. If you don't have a lot of links, the search engines may take several days or weeks before they revisit and notice the change to robots.txt.