GWT: "Too Many URLs for Googlebot" - But All Are Disallowed

6:56 pm on Oct 29, 2008 (gmt 0)

New User

5+ Year Member

joined:Dec 11, 2006
posts: 14
votes: 0


This is driving me a little batty. I keep getting messages in Google Webmaster Tools about Googlebot finding too many URLs on a site. I acted on this message a while ago and disallowed all the offending URLs (filter combos, sorts, etc.) in robots.txt to try to fix it. But I keep getting the message, and the URLs it now cites are all disallowed in robots.txt! What should I do here? Do I need to JavaScript these URLs or switch everything to drop-downs?
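
For context, the kind of rules I mean look roughly like this (placeholder paths and parameter names, not the real ones; Google honours the * wildcard in robots.txt for query-string patterns):

User-agent: *
# placeholder faceted-navigation paths
Disallow: /products/filter/
Disallow: /products/sort/
# wildcard form for query-string variants (a Google-supported extension)
Disallow: /*?sort=
Disallow: /*&filter=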
8:02 pm on Oct 29, 2008 (gmt 0)

Senior Member

tedster (WebmasterWorld Top Contributor of All Time, 10+ Year Member)

joined:May 26, 2000
posts:37301
votes: 0


Can you easily make a URL removal request for this set of URLs? I think that would fix whatever short circuit is going on.
8:40 pm on Oct 29, 2008 (gmt 0)

New User

5+ Year Member

joined:Dec 11, 2006
posts:14
votes: 0


Unfortunately, no.
8:48 pm on Oct 29, 2008 (gmt 0)

Senior Member

tedster (WebmasterWorld Top Contributor of All Time, 10+ Year Member)

joined:May 26, 2000
posts:37301
votes: 0


Well, then it will take longer. But if your robots.txt is accurate, this issue should still get sorted out over time.

If you have a Webmaster Tools account, there is a good tool in there to help you verify that your robots.txt is valid and functioning as you planned.
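
If you want a second check outside of Webmaster Tools, one option is to test a few of the flagged URLs locally with Python's urllib.robotparser. Note that it only does plain prefix matching and does not understand Google's * and $ wildcard extensions, so only prefix-style rules can be verified this way. The rules and URLs below are placeholders, not anything from your actual site:

from urllib import robotparser

# Placeholder rules; paste in the prefix-style lines from your own robots.txt.
# robotparser does simple prefix matching only, so Google-style * wildcards
# are not interpreted here.
rules = """
User-agent: *
Disallow: /products/filter/
Disallow: /products/sort/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Placeholder URLs; swap in the ones the GWT message lists.
for url in ("http://www.example.com/products/filter/red/",
            "http://www.example.com/products/"):
    status = "blocked" if not rp.can_fetch("Googlebot", url) else "allowed"
    print(url, status)

If a URL that should be blocked prints as allowed, the rule covering it probably isn't a plain prefix match, so check it with the Webmaster Tools tester instead.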

9:07 pm on Oct 29, 2008 (gmt 0)

New User

5+ Year Member

joined:Dec 11, 2006
posts:14
votes: 0


Thanks. Also, I've already checked that the robots.txt is valid. Has anyone else with this issue seen this message stop after putting sorts/filters into drop-downs or JavaScript?