GWT: "Too Many URLs for Googlebot" - But All Are Disallowed


eileenw

6:56 pm on Oct 29, 2008 (gmt 0)

5+ Year Member



This is driving me a little batty. I keep getting messages in Google Webmaster Tools about Googlebot finding too many URLs on a site. I took action on this a while ago and disallowed all the offending URLs (filter combos, sorts, etc.) in robots.txt to try to fix it. But I keep getting the message, now citing URLs that are all disallowed in robots.txt! What should I do here? Do I need to JavaScript these URLs or switch everything to drop-downs?
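For reference, the rules I added look something like this (the parameter names are changed for the example; Googlebot does support the * wildcard in robots.txt):

    User-agent: *
    # Keep the sort/filter URL permutations out of the crawl
    Disallow: /*?sort=
    Disallow: /*&sort=
    Disallow: /*?filter=
    Disallow: /*&filter=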

tedster

8:02 pm on Oct 29, 2008 (gmt 0)

WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member



Can you easily make a URL removal request for this set of URLs? I think that would fix whatever short circuit is going on.

eileenw

8:40 pm on Oct 29, 2008 (gmt 0)

5+ Year Member



Unfortunately, no.

tedster

8:48 pm on Oct 29, 2008 (gmt 0)

WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member



Well, then it will take longer. But if your robots.txt is accurate, this issue should still get sorted out over time.

If you have a Webmaster Tools account, there is a good tool in there to help you verify that your robots.txt is valid and functioning as you planned.
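If you want a rough second opinion outside of that tool, a snippet like this in the browser console (run on a page of your own site) will test a path against your Disallow rules. It's only a sketch, and the test path is made up; it ignores per-agent groups, Allow rules, and $-anchors:

    // Fetch the live robots.txt and test a path against each
    // Disallow rule, treating * as "match any sequence".
    const testPath = "/widgets?sort=price"; // illustrative path
    fetch("/robots.txt")
      .then(r => r.text())
      .then(txt => {
        const rules = txt
          .split("\n")
          .filter(line => /^Disallow:/i.test(line))
          .map(line => line.replace(/^Disallow:/i, "").trim())
          .filter(rule => rule.length > 0);
        const toRegex = rule =>
          new RegExp(
            "^" +
              rule
                .replace(/[.+?^${}()|[\]\\]/g, "\\$&") // escape regex chars
                .replace(/\*/g, ".*") // robots.txt wildcard
          );
        const blocked = rules.some(rule => toRegex(rule).test(testPath));
        console.log(testPath, "->", blocked ? "blocked" : "allowed");
      });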

eileenw

9:07 pm on Oct 29, 2008 (gmt 0)

5+ Year Member



Thanks, though I've already checked that the robots.txt is valid. Has anyone else with this issue seen the message stop after putting sorts/filters into drop-downs or JavaScript (something like the sketch below)?
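To be concrete, by drop-downs I mean something like this, so the sort URLs never appear as plain links (paths are illustrative):

    <!-- Navigate on selection instead of exposing crawlable <a href> links -->
    <select onchange="if (this.value) location.href = this.value;">
      <option value="">Sort by...</option>
      <option value="/widgets?sort=price">Price</option>
      <option value="/widgets?sort=name">Name</option>
    </select>

Though I realize there may be no guarantee Googlebot won't still pick URL-like strings out of the markup.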
 
