We re-launched our website yesterday, but due to lack of time I forgot to check robots.txt.
This morning I opened Google Webmaster Tools and tried to re-submit our sitemap. The submission failed because crawling was restricted by robots.txt.
I realised there must be something wrong with robots.txt, and I found it was disallowing all bots from visiting our website.
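For anyone hitting the same thing: I don't have the exact file contents in front of me, but a robots.txt that blocks every crawler typically looks like the first block below (often left over from a staging setup), and the fix is the second:

```
# Blocks ALL crawlers from the ENTIRE site:
User-agent: *
Disallow: /

# Allows all crawlers (empty Disallow means nothing is blocked):
User-agent: *
Disallow:
```

Note the single slash makes all the difference.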
Google Webmaster Tools says our robots.txt was cached 9 hours ago. I'm wondering how long until it gets fetched again, so I can re-submit our sitemap.
Hopefully this won't affect our rankings at all (roughly 24 hours during which crawling was blocked). Any experts' opinions?
[edited by: jatar_k at 2:12 pm (utc) on Mar 5, 2010]
[edit reason] no urls thanks [/edit]