pageoneresults - 4:08 pm on Nov 7, 2010 (gmt 0)
What are you referring to here? A problem with URL rewriting? The URLs that Googlebot attached to this message are working fine; almost all are paginated URLs, and they have the right canonical link.
I do believe the URIs are just a sampling of what it found and not the entire set. If you browse to a paginated URI that is not valid, are the proper server headers being returned?
A little spike in mid-October (time spent, kilobytes downloaded, and number of pages crawled).
And this message from Google came after that "little spike" in the middle of October?
I'd be looking for technical glitches at this point to make sure all is okay. If you didn't make any changes, that makes it a bit more challenging to determine what might be happening. It could be a glitch in GWT, but for as long as I've used it, the information reported has been accurate. When you get a notification from Google like this, it's cause for concern. The first thing I'd check is server headers, to make sure the bots are getting the proper directives based on their requests: 200s where appropriate, 301s, 404s, 410s, etc.
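If it helps, here's a minimal sketch (Python, standard library only) for spot-checking what your server actually returns. The example.com URLs are placeholders; substitute a sample of the URIs from the GWT message, including a few that shouldn't resolve:

    from http.client import HTTPConnection
    from urllib.parse import urlparse

    # Placeholder URLs -- swap in a sample of the paginated URIs from the
    # GWT message, including a few that should NOT exist.
    URLS = [
        "http://www.example.com/widgets?page=2",     # valid page: expect 200
        "http://www.example.com/widgets?page=9999",  # bogus page: expect 404 or 410
        "http://www.example.com/old-widgets",        # moved content: expect 301
    ]

    def head_status(url):
        """HEAD request reporting the raw status code, without following redirects."""
        parts = urlparse(url)
        conn = HTTPConnection(parts.netloc)
        path = parts.path or "/"
        if parts.query:
            path += "?" + parts.query
        conn.request("HEAD", path)
        resp = conn.getresponse()
        location = resp.getheader("Location", "")
        conn.close()
        return resp.status, location

    for url in URLS:
        status, location = head_status(url)
        print(status, url, ("-> " + location) if location else "")

Using http.client directly (rather than urlopen) means redirects aren't followed, so you see the actual 301 the bot receives instead of the final destination's 200.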
From Google's John Mu...
We show this warning when we find a high number of URLs on a site -- even before we attempt to crawl them. If you are blocking them with a robots.txt file, that's generally fine. If you really do have a high number of URLs on your site, you can generally ignore this message. If your site is otherwise small and we find a high number of URLs, then this kind of message can help you to fix any issues (or disallow access) before we start to access your server to check gazillions of URLs :-).
Googlebot encountered an extremely high number of URLs on your site
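Following up on John Mu's point about blocking with robots.txt: a quick way to confirm which of the flagged URLs are actually disallowed is Python's built-in robotparser. Again, the domain and URLs here are placeholders; substitute your own:

    from urllib.robotparser import RobotFileParser

    # Placeholder site and URLs -- substitute your own domain and a sample
    # of the URIs Google attached to the warning.
    rp = RobotFileParser("http://www.example.com/robots.txt")
    rp.read()  # fetch and parse the live robots.txt

    urls = [
        "http://www.example.com/widgets?page=2",
        "http://www.example.com/widgets?sort=price&page=2",
    ]

    for url in urls:
        allowed = rp.can_fetch("Googlebot", url)
        print("allowed" if allowed else "disallowed", url)

One caveat: Python's parser follows the original robots.txt spec, so Google-specific extensions like wildcards may not match exactly the way Googlebot interprets them. Double-check anything borderline in the GWT robots.txt testing tool.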