Forum Moderators: goodroi
I'm doing some consultancy at a small web design firm and we've just spotted a major flaw: there was a robots.txt in the root of their shared CMS directory that disallowed all spiders - effectively banning search engines from the site.
We've now removed this; does anyone know if there is a way to 'nudge' Google or Yahoo into checking the robots.txt again, so we can then submit a sitemap and try to get these sites 'visible' to the engines again?
Thanks for any help
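If it helps anyone in a similar spot, you can sanity-check the corrected file before waiting on the crawlers. This is just a sketch using Python's standard-library robots.txt parser; the rules shown are an assumed blanket ban and an assumed allow-all, not the poster's actual file:

```python
from urllib.robotparser import RobotFileParser

# The old (broken) file presumably looked like a blanket ban:
banned = RobotFileParser()
banned.parse([
    "User-agent: *",
    "Disallow: /",
])
print(banned.can_fetch("Googlebot", "/index.html"))  # False - everything blocked

# After the fix, an empty Disallow permits all crawling:
fixed = RobotFileParser()
fixed.parse([
    "User-agent: *",
    "Disallow:",
])
print(fixed.can_fetch("Googlebot", "/index.html"))  # True - crawlers allowed
```

For the live file, `RobotFileParser.set_url("http://example.com/robots.txt")` plus `read()` fetches and parses it directly, which makes a handy check that the version the engines will see is the one you intended.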
On a very well linked site that had the same problem as yours, Google took less than 24 hours to start indexing it. If you don't have a lot of links, the search engines may take several days or weeks before they revisit and notice the change to robots.txt.