
Google SEO News and Discussion Forum

    
Old Robots.txt file used by Google - not newer version
doughayman - msg:4129041 - 10:28 pm on May 7, 2010 (gmt 0)

Hi,

I have a robots.txt file for a domain. It contains 230 Disallow statements, all of which are syntactically valid. Googlebot routinely reads the file, and WMT indicates that processing it is "successful".

My problem is that a URL which was disallowed in an old version of robots.txt several months back is still being blocked, when in fact I no longer want it blocked.

For whatever reason, Google seems to be using that old version of robots.txt, despite the fact that I've made many changes to the file over the last month and it has been spidered by Google since.

Is there a standard period of time that typically needs to elapse before a new version of robots.txt becomes the de facto standard for the site? Is there something I can do to force Google to use the new version?
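A minimal sketch of one way to confirm that the file currently being served no longer blocks the URL, using Python's standard urllib.robotparser (the domain and path here are placeholders, not the real site):

# Minimal sketch: ask the *live* robots.txt whether a user agent may fetch a URL.
# example.com and the path below are placeholders for illustration only.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()  # downloads and parses the file currently being served

# can_fetch() returns True if the named user agent is allowed to crawl the URL
print(rp.can_fetch("Googlebot", "https://www.example.com/formerly-blocked-page/"))

If that prints True, the file being served is fine, and the question becomes why Google's copy disagrees.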

Thanks in advance!

 

tedster - msg:4129083 - 12:06 am on May 8, 2010 (gmt 0)

Do you mean that WMT says the URL is blocked by robots.txt? Or do you mean that Google still isn't requesting that URL from your server?

If it's just the first, even though the new version has been spidered, then it may be only a reporting problem. But if googlebot isn't requesting the URL, that's a different situation.
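If you want to settle the second question from your own data, the raw access logs are the most direct evidence. A rough sketch in Python, assuming a combined-format access log; the log path and the target path are assumptions for illustration:

# Rough sketch: count Googlebot requests for one path in an access log.
# The log location, log format, and target path are placeholders.
target = "/formerly-blocked-page/"
hits = 0
with open("/var/log/apache2/access.log") as log:
    for line in log:
        if "Googlebot" in line and '"GET ' + target in line:
            hits += 1
print("Googlebot requests for %s: %d" % (target, hits))

Bear in mind that anything can claim to be Googlebot in the user-agent string, so a reverse DNS check on the requesting IPs is the way to verify the hits are genuine.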

doughayman - msg:4129090 - 12:26 am on May 8, 2010 (gmt 0)

Ted, WMT says that the URL is blocked by robots.txt, even though I removed that explicit restriction from robots.txt several months ago.

tedster - msg:4129092 - 12:31 am on May 8, 2010 (gmt 0)

OK. So the next step would be "is this a buggy report?" In other words, is googlebot requesting the URL anyway - and is it indexed?

doughayman - msg:4129108 - 1:23 am on May 8, 2010 (gmt 0)

Thanks, Ted. It looks like it is a buggy report for some of the affected URLs (they ARE being spidered and indexed), while others are not being requested at all by Googlebot, even though it has been several months since the Disallow clause for them was removed from robots.txt. Once again, Google is extremely hard to figure out, and reliability here is mighty questionable. Thanks for your input. I was wondering if others have had similar issues, and whether they were eventually resolved.
