Got a GWMT warning 'Googlebot found an extremely high number of URLs on your site', which is obviously cause for concern. What's puzzling is that the list of examples includes lots of URLs that are either excluded via our Robots.txt file or use parameters that should be ignored based on our parameter handling settings. Any thoughts as to what's going on and how to address the problem?
They should be obeying your robots.txt unless you have written it incorrectly. But I have never seen their bots respect those "parameter handling settings" in Google Webmaster Tools. I don't even know why they provide that feature when their bots still attempt to crawl most of those URLs.
The only reliable way to block them is robots.txt, though Google finds workarounds even for that these days. For example, make sure that you don't have a +1 button on those pages; otherwise, they might not obey robots.txt for them.
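For reference, a minimal robots.txt sketch for blocking parameterized URLs from Googlebot, which does honour the * wildcard in Disallow rules. The parameter names here (sessionid, sort) are placeholders; substitute whatever parameters you actually want kept out of the crawl:

    User-agent: Googlebot
    # Block the parameter whether it appears first or later in the query string
    Disallow: /*?sessionid=
    Disallow: /*&sessionid=
    Disallow: /*?sort=
    Disallow: /*&sort=

Blocking at the robots.txt level is also more predictable than the parameter handling settings, since it doesn't depend on Google choosing to apply a crawl hint.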
I did recently add +1 buttons to some of the pages with 'ignored' parameters, which may be why this just cropped up for those. But the robots.txt exclusions have been in place forever, are fairly limited in number, and are definitely implemented correctly.
If Google considers the presence of a +1 button as grounds to ignore robots.txt exclusions, then perhaps it's ignoring your directives about which parameters on those pages should be ignored too, and the combinatorics are blowing out the URL count.
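To give a sense of how fast that blows up, here is a rough sketch (with made-up parameter names and values; your real ones will differ) of how a few optional query parameters multiply into distinct URLs from a crawler's point of view:

    import itertools

    # Hypothetical query parameters and example values for illustration only.
    params = {
        "sort": ["price", "name", "date"],
        "page": ["1", "2", "3", "4", "5"],
        "color": ["red", "blue", "green"],
        "sessionid": ["abc123"],  # in practice this one is effectively unbounded
    }

    urls = set()
    # Every non-empty subset of parameters, in every value combination,
    # is a distinct URL as far as a crawler is concerned.
    for r in range(1, len(params) + 1):
        for subset in itertools.combinations(params, r):
            for values in itertools.product(*(params[k] for k in subset)):
                query = "&".join(f"{k}={v}" for k, v in zip(subset, values))
                urls.add(f"/products?{query}")

    print(len(urls))  # 191 -- nearly two hundred URLs from just four small parameters

And if parameter order in the query string also varies (?a=1&b=2 vs ?b=2&a=1), the count grows further still, which is presumably how a modest site ends up triggering the 'extremely high number of URLs' warning.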