It's gone, Googlebot. It's not coming back.
If it means loading up the SERPs with old pages and junk to be the biggest, I'll pass. I think this is a good move on Google's part.
I hope that means they will take out the listings for pages that no longer exist.
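For what it's worth, if you want to see how many of your listed URLs are actually dead, a rough sketch along these lines will do it (the URLs and domain below are just placeholders for your own list):

import urllib.request
import urllib.error

# Placeholder list: swap in URLs pulled from your logs or a site: search.
urls = [
    "http://www.example.com/old-page.html",
    "http://www.example.com/current-page.html",
]

for url in urls:
    try:
        code = urllib.request.urlopen(url).getcode()
    except urllib.error.HTTPError as e:
        code = e.code
    except urllib.error.URLError:
        code = None
    if code in (404, 410):
        print("gone:", url)
    else:
        print("live:", url, code)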
I can't really think of any other good reason they might want to do what the market may consider an embarrassing climbdown. Perhaps, maybe, just a remote possibility, but could they be considering cleaning up the supplemental index and therefore "losing" a large number of pages?
I have fewer than 1,500 real pages, yet Google's count for site:www.domain.com is over 100,000. That is not a joke.
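If you want a hard number to set against that site: figure, a rough sketch like this will count the pages you actually have (it assumes a sitemap at the usual /sitemap.xml location; the domain is a placeholder):

import urllib.request
import xml.etree.ElementTree as ET

# Placeholder location: point this at your own sitemap.
SITEMAP_URL = "http://www.example.com/sitemap.xml"

with urllib.request.urlopen(SITEMAP_URL) as resp:
    tree = ET.parse(resp)

# Every <url><loc> entry in a sitemap lives in this namespace.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
locs = tree.findall(".//sm:loc", ns)

print("Real pages listed in the sitemap:", len(locs))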
[Marissa] Mayer [Google's director of consumer products] said that since apples-to-apples comparisons are no longer possible, Google decided to stop listing the size of its index and instead invite Web surfers to conduct the equivalent of a "taste test" to see which engine consistently delivers the most results.
Now obviously I can't speak for anyone else, but personally I'm not at all interested in which engine can supply me with the most results; what I want is the best results. Being able to index every scraper site, DMOZ & Wiki clone does not equal quality - especially if that leaves me trying to find the "needle in a haystack."
I've tasted, and it's rancid.
>>Now obviously I can't speak for anyone else, but personally I'm not at all interested in which engine can supply me with the most results; what I want is the best results.<<
However, it seems that Google is unable to deliver the expected QUALITY serps. They have been trying to achieve that, especially during 2005, but the results aren't encouraging at all.
So instead of QUALITY, we see, for example, Mr. Eric Schmidt talking about a "larger index", not "best results"!
I don't wish to sound anti-Google, but that's how I feel sometimes :(
By modifying URLs and generating duplicates of them, Google is creating index bloat, and it is suppressing decent websites by imposing its flawed approach to 'duplicate content'.
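To illustrate the mechanism with a rough sketch (the query parameters here are made up), several 'different' URLs that vary only by query string all resolve to one page, yet each one can be counted as a separate document:

from urllib.parse import urlsplit, urlunsplit

# Made-up example: one page reachable under several query strings.
urls = [
    "http://www.example.com/widgets.html",
    "http://www.example.com/widgets.html?sessionid=123",
    "http://www.example.com/widgets.html?ref=homepage&sessionid=456",
]

# Strip the query string to see how many distinct pages are really there.
canonical = {urlunsplit(urlsplit(u)._replace(query="")) for u in urls}

print(len(urls), "URLs, but only", len(canonical), "real page(s):", canonical)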
One of my sites has around 50,000 pages. Google was showing around 400,000. I submitted a revised robots.txt which should have taken out about 95% of the site. Although you can no longer find those pages in their index, they now show 415,000 pages for this site!
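Before (and after) submitting a revised robots.txt like that, it's worth sanity-checking which paths it actually blocks. Here's a rough sketch using the standard library parser, with placeholder rules standing in for the real file:

from urllib.robotparser import RobotFileParser

# Placeholder rules standing in for a revised robots.txt that blocks
# the bulk of a site while leaving the core pages crawlable.
rules = """\
User-agent: *
Disallow: /archive/
Disallow: /print/
Disallow: /search/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

for path in ("/index.html", "/archive/2004/old-post.html", "/print/page42.html"):
    url = "http://www.example.com" + path
    print(path, "->", "allowed" if rp.can_fetch("Googlebot", url) else "blocked")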
As for Google's results.. I agree on quality over quantity. If a search engine were able to deliver the 5 most relevant results to what I wanted in my head, then it wouldn't matter if there were 100,000 other results or 100,000,000. Relevancy is infinitely more important. When I have had to go deep through the serps to find information, it is only because the relevancy sucks.
I read a comment somewhere that the average, typical, non-Webmaster user still loves Google, and that they don't notice any difference in results. I disagree with that. For the better part of a year now, I've had people tell me Google seems to be "hosed up" and that to find good results, they're going elsewhere.
Some are going to Yahoo, some MSN, and a few have even said they are loving Ask Jeeves again.
Personally, I'd love to see more uniform traffic from the SEs, as some of them perform better in certain areas. If overall traffic were more consistent, more attention would be spent on results, because that is what matters, after all.
Bigger is not better when it comes to SERPs. I'm surprised Google doesn't know this, but apparently they don't. That's why you find so much junk outranking the good core pages. Google needs to really do a thorough cleaning, and get rid of all the 404 pages, directory sites, scraper sites, and spammy sites. Then it might be relevant again.