
Forum Moderators: goodroi

Google cached "disallow all" robots.txt and sitemap re-submit failed

   
11:12 am on Mar 5, 2010 (gmt 0)

5+ Year Member



Hi there,

We re-launched our website yesterday, but due to lack of time I forgot to check robots.txt.

This morning I opened Google Webmaster Tools and tried to re-submit our sitemap. The submission failed because of a restricted robots.txt.

I realised there must be something wrong with robots.txt, and sure enough it is disallowing every bot from visiting our website.
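In other words, the file amounts to the standard disallow-all block (simplified here):

User-agent: *
Disallow: /

whereas a version that lets every crawler back in simply leaves the Disallow value empty:

User-agent: *
Disallow: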

In Google Webmaster Tools it says the robots.txt was cached 9 hours ago. I'm wondering how long it will be before it is checked again, so that I can re-submit our sitemap.

Hopefully this will not affect our rankings at all (roughly 24 hours of being uncrawlable). Any expert opinions?

[edited by: jatar_k at 2:12 pm (utc) on Mar 5, 2010]
[edit reason] no urls thanks [/edit]

1:55 pm on Mar 5, 2010 (gmt 0)

goodroi - WebmasterWorld Administrator, Top Contributor of All Time, 10+ Year Member



Google's crawling rate tends to be linked to how important they think your website is. Sites like Digg and the New York Times are crawled every few minutes. Weak websites will be crawled every few days.

Your rankings will only be impacted for the pages that were blocked when Google tried to visit them. If you caught the problem quickly, very few pages will be affected. Once you remove the block, rankings will return to normal after the next crawl cycle, and how soon that happens depends on how important your website is to Google.
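Before you resubmit the sitemap, it is also worth fetching the live robots.txt yourself and confirming Googlebot is no longer blocked, rather than waiting on the cached copy. A rough Python sketch (example.com and the test URLs are placeholders for your own site):

from urllib import robotparser

# Placeholder domain -- substitute your own site and a real page.
rp = robotparser.RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()  # downloads and parses the live file, not Google's cached copy

# True means a crawler identifying itself as Googlebot may fetch the URL.
print(rp.can_fetch("Googlebot", "http://www.example.com/"))
print(rp.can_fetch("Googlebot", "http://www.example.com/sitemap.xml"))

If both checks come back True, the block is gone at the source and it is just a matter of waiting for Google to re-fetch the file.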
 
