Googlebot encountered extremely large numbers of links on your site. This may indicate a problem with your site's URL structure. Googlebot may unnecessarily be crawling a large number of distinct URLs that point to identical or similar content, or crawling parts of your site that are not intended to be crawled by Googlebot. As a result Googlebot may consume much more bandwidth than necessary, or may be unable to completely index all of the content on your site.
Live with it and don't complain.
Over the last three years I've migrated 75% of my "ad income" away from Google and have seen a 22% increase in income.
I'd double-check and make sure you don't have malformed syntax somewhere that's causing the bot to get hung up in some sort of black hole.
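A classic example of that kind of black hole (hypothetical, not necessarily what's happening here) is a relative link that's missing its leading slash:

<!-- on /section/page1, a broken relative link -->
<a href="section/page2">Next</a>
<!-- resolves to /section/section/page2, then /section/section/section/page2, and so on -->

Every page the bot fetches exposes a new, deeper URL, so the URL count explodes even though the content never changes.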
Did you make any changes since the time that message appeared?
Do you see an initial spike and then lots of spikiness afterward?
What are you referring to here? A problem in URL rewriting? The URLs that Googlebot attached to this message are working fine; almost all are paginated URLs, and they have the right canonical link.
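For reference, the canonical on those paginated URLs is self-referencing, something like this in the <head> (the /articles?page=3 path is just a made-up stand-in for the real one):

<link rel="canonical" href="http://www.example.com/articles?page=3" />

If every page in the series pointed its canonical back at page 1 instead, Google could drop the deeper pages, and the items linked from them, from consideration.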
A little spike in the middle of October (time spent downloading, kilobytes downloaded, and number of pages crawled).
We show this warning when we find a high number of URLs on a site -- even before we attempt to crawl them. If you are blocking them with a robots.txt file, that's generally fine. If you really do have a high number of URLs on your site, you can generally ignore this message. If your site is otherwise small and we find a high number of URLs, then this kind of message can help you to fix any issues (or disallow access) before we start to access your server to check gazillions of URLs :-).
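If the extra URLs are coming from parameters or site search, a rough robots.txt sketch (the paths here are placeholders; adjust them to your own URL patterns) might look like:

User-agent: *
Disallow: /search/
Disallow: /*?sort=
Disallow: /*?sessionid=

Keep in mind that the wildcard patterns (like /*?sort=) are understood by Googlebot but aren't part of the original robots.txt standard, so other crawlers may ignore them.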
And this message from Google came after that "little spike" in the middle of October?
How would you deal with paginated URLs on your site?
On a side note, I noarchive everything these days - everything.
...should I just use a <meta name="robots" content="noindex"> tag on paginated URLs?
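If you do, a common pattern (worth testing on your own site rather than taking as gospel) is noindex,follow, so the bot can still follow the pagination to reach the deep items without indexing the thin paginated pages themselves:

<meta name="robots" content="noindex,follow">

And since noarchive came up above, it can be combined in the same tag:

<meta name="robots" content="noindex,follow,noarchive">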