---- Lost all rankings from Google - due to robots.txt
g1smd - 10:44 am on Jun 11, 2012 (gmt 0)
Instead, remove the "Disallow: /" rule from it, leaving a robots.txt file that explicitly allows crawling.
The correct syntax is:
User-agent: *
Disallow:
with at least one blank line after the Disallow directive.
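To see why that one character matters, here's a quick sketch (my own example, not from the thread) using Python's standard-library urllib.robotparser: an empty Disallow permits all crawling, while "Disallow: /" blocks the entire site.

```python
import urllib.robotparser

def is_allowed(robots_txt: str, url: str, agent: str = "Googlebot") -> bool:
    # Parse a robots.txt body and ask whether the given agent may fetch the URL.
    parser = urllib.robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(agent, url)

# An empty Disallow explicitly allows everything.
good = "User-agent: *\nDisallow:\n"
# A single stray slash blocks the whole site.
bad = "User-agent: *\nDisallow: /\n"

print(is_allowed(good, "http://www.example.com/"))  # True
print(is_allowed(bad, "http://www.example.com/"))   # False
```

The URL and user-agent string here are illustrative; the point is that the only difference between "crawl everything" and "crawl nothing" is the slash after the colon.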
Fix the robots.txt file. Wait 48 hours. Go to WMT and then use the "Fetch as Googlebot" function to retrieve your root page (www.example.com/), then click the "Submit page and all linked pages" option.
I can't count the number of times I have accidentally uploaded the wrong robots.txt file to a site. However, the error has almost always been corrected within a few minutes. Even so, there have been a couple of occasions where Google had already grabbed the file seconds before the corrections were applied. In those cases it took 24 hours for Google to revisit and get the right version of the file.

It's a shame there isn't a WMT button that says "I've messed up my robots.txt file; please discard the last version and grab the corrected one as soon as possible". If an incorrect file is corrected within 24 hours there appears to be no damage done.