Forum Moderators: Robert Charlton & goodroi
'Googlebot found an extremely high number of URLs on your site
Googlebot encountered extremely large numbers of links on your site. This may indicate a problem with your site's URL structure. Googlebot may unnecessarily be crawling a large number of distinct URLs that point to identical or similar content, or crawling parts of your site that are not intended to be crawled by Googlebot. As a result Googlebot may consume much more bandwidth than necessary, or may be unable to completely index all of the content on your site.'
Could this be the reason I'm yo-yoing in and out of the SERPs? Has anybody else seen this message?
Unnecessarily high numbers of URLs can be caused by a variety of issues. These include:
- Additive filtering of a set of items
- Dynamic generation of documents. This can result in small changes because of counters, timestamps, or advertisements.
- Problematic parameters in the URL. Session IDs, for example.
- Sorting parameters. Some large shopping sites provide multiple ways to sort the same items.
- Irrelevant parameters in the URL, such as referral parameters.
- A dynamically generated calendar might generate links to future and previous dates with no restrictions on start or end dates.
- Broken relative links.
[google.com...]
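To see how several of the causes above (session IDs, sorting parameters, referral parameters) multiply one page into many URLs, here is a minimal sketch that collapses such URLs back to one canonical form. The parameter names in `IRRELEVANT_PARAMS` are hypothetical examples; the real set depends entirely on your site.

```python
from urllib.parse import urlparse, urlunparse, parse_qsl, urlencode

# Hypothetical parameters that do not change the page content.
# On a real site you would audit your own URLs to build this list.
IRRELEVANT_PARAMS = {"sessionid", "ref", "sort"}

def canonicalize(url):
    """Strip content-irrelevant query parameters so duplicate
    URLs collapse to a single canonical form."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in IRRELEVANT_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

urls = [
    "http://example.com/item?id=42&sessionid=abc123",
    "http://example.com/item?id=42&ref=homepage",
    "http://example.com/item?id=42&sort=price",
]
# All three distinct URLs collapse to http://example.com/item?id=42
print({canonicalize(u) for u in urls})
```

Three URLs Googlebot would crawl separately turn out to be one page, which is exactly the kind of multiplication the warning describes.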
So is this a reason for a Yo-Yo problem in the SERPs? Only Google can say for sure, but I would definitely encourage you to fix the issue and find out.
I have put the nofollow tag on all those pages and have also added them to my robots.txt disallow list. Fingers crossed, this should fix my problems.
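For anyone taking the same robots.txt route, a disallow list for parameterized URLs might look something like the sketch below. The parameter names are hypothetical; Googlebot supports the `*` wildcard in Disallow rules, though note that blocking a URL in robots.txt also prevents Googlebot from seeing any noindex meta tag on that page.

```
User-agent: Googlebot
Disallow: /*?sessionid=
Disallow: /*?sort=
Disallow: /*?ref=
```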
"Would this cause the yo-yoing that I have had since xmas?"
We're blocking the indexing of navigation and all that other stuff with new releases. Got a neat little setup going that I think is going to prevent all of the above. We will see as we move forward with a recent launch.
The strange thing, guys, is that all these pages already used the noindex, follow tag.
Prior to the warning? If so, can you show us the code you are using? Just the meta element and the order of the directives. I know you said noindex, follow, and if that is the case, I think something may be broken with comma-separated directives. I too have seen stuff getting indexed and being followed, yada, yada, yada, even after having various meta elements set to noindex, or nofollow, or none, you name it. I use that element judiciously.
Googlebot found an extremely high number of URLs on your site.
Seems they're not happy with me taking up Googlebot's time.
<meta name="robots" content="noindex, follow" />
You know, I was thinking something else originally. < That happens more frequently these days. After reading through the topic again, I might agree with your assessment. Maybe there is a disproportionate number of allowed, noindex, and follow directives.
Maybe some sort of duplication occurring in the follow routine? Looping? This would be a dynamic site, yes? Is there a rewrite involved? Have you double-checked everything? I've seen Google Webmaster Tools flag things that have actually been problems for webmasters. It's a nice feature to have available to you when things are accurate. ;)