
Google cached "disallow all" robots.txt and sitemap re-submit failed

     
11:12 am on Mar 5, 2010 (gmt 0)

New User

joined: Feb 3, 2010
posts: 3
votes: 0


Hi there,

We re-launched our website yesterday, but due to lack of time I forgot to check robots.txt.

This morning I opened Google Webmaster Tools and tried to re-submit our sitemap. It failed because of a restrictive robots.txt.

I realised something must be wrong with robots.txt, and found that it is disallowing every bot from visiting our website.
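
For reference, in case anyone else hits the same thing: a "block everything" robots.txt is usually just the two lines below, and the fix is to leave the Disallow value empty (the sitemap URL is only a placeholder):

# Blocks every compliant crawler from the entire site:
User-agent: *
Disallow: /

# Fixed version: an empty Disallow allows everything; the Sitemap line is
# optional and http://www.example.com/sitemap.xml is just a placeholder:
User-agent: *
Disallow:
Sitemap: http://www.example.com/sitemap.xml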

In Google Webmaster Tools it says the robots.txt was cached 9 hours ago. I wonder how long it will be until it is checked again, so that I can re-submit our sitemap.

Hopefully this won't affect our rankings at all (roughly 24 hours of being unavailable to crawl). Any expert opinions?

[edited by: jatar_k at 2:12 pm (utc) on Mar 5, 2010]
[edit reason] no urls thanks [/edit]

1:55 pm on Mar 5, 2010 (gmt 0)

Administrator from US 

WebmasterWorld Administrator goodroi

joined: June 21, 2004
posts: 3155
votes: 129


Google's crawling rate tends to be linked to how important they think your website is. Sites like Digg and the New York Times are crawled every few minutes. Weak websites will be crawled every few days.

Your rankings will only be impacted for the pages that were blocked when Google tried to visit them. If you caught the problem quickly, very few pages will be affected. Once you remove the block, rankings will return to normal after the next crawl cycle, which depends on how important your website is to Google.
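
If you want to double-check from your end before re-submitting the sitemap, one quick sanity check is to parse the live file and ask whether Googlebot may fetch your pages. A minimal sketch using Python's built-in robotparser module (Python 3 syntax; the example.com URLs are placeholders for your own site):

from urllib import robotparser

# Fetch and parse the live robots.txt (swap example.com for your own domain).
rp = robotparser.RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()

# True means Googlebot is allowed to crawl that URL; with "Disallow: /" in
# place, both of these would print False.
print(rp.can_fetch("Googlebot", "http://www.example.com/"))
print(rp.can_fetch("Googlebot", "http://www.example.com/sitemap.xml"))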
 
