
Google takes down front-page boast about index size

     
11:35 am on Sep 27, 2005 (gmt 0)

Senior Member

joined:May 21, 2002
posts:762
votes: 0


Google Inc. will stop boasting on its home page about the number of Web pages it has stored in its index, even as the online search engine leader continues a crusade to prove it scans substantially more material than its rivals.

[usatoday.com...]

12:14 pm on Sept 27, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member annej is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Feb 17, 2002
posts:3318
votes: 0


I hope that means they will take out the listings for pages that no longer exist. I removed some pages from my sites early in the summer and they are still listed on Google.

12:33 pm on Sept 27, 2005 (gmt 0)

Preferred Member

10+ Year Member

joined:Jan 11, 2005
posts:513
votes: 0


In the SE world, bigger is not necessarily better. Googlebot has hit my error logs for the past three years looking for a page I did a 301 redirect on. It was a Flash page, and I used it back when Flash was all the rage. It was an intro page into a section of my site. I removed it and put a 301 on it, but Googlebot still asks for it.

It's gone, Googlebot. It's not coming back.

If it means loading up the SERPs with old pages and junk to be the biggest, I'll pass. I think this is a good move on Google's part.
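
For anyone setting up the same thing, a minimal sketch of that kind of redirect in an Apache .htaccess file (the paths here are invented for illustration; assume the old Flash intro lived at /intro.swf and the section now starts at /section/):

Redirect 301 /intro.swf http://www.example.com/section/

You can verify it with curl -I http://www.example.com/intro.swf and check for "HTTP/1.1 301 Moved Permanently" plus a Location header pointing at the new page. Whether Googlebot ever takes the hint is another question.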

4:57 pm on Sept 27, 2005 (gmt 0)

Preferred Member

10+ Year Member

joined:May 13, 2003
posts:442
votes: 0


They've stopped already!

And already I kind of miss it, even if the birthday cake gives me some consolation.

5:22 pm on Sept 27, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Feb 21, 2005
posts:2259
votes: 0


I hope that means they will take out the listings to pages that no longer exist

I can't really think of any other good reason they might want to do what the market may consider an embarrassing climbdown. Perhaps, just maybe, a remote possibility: could they be considering cleaning up the supplemental index and therefore "losing" a large number of pages?

6:53 pm on Sept 27, 2005 (gmt 0)

Senior Member from MY 

WebmasterWorld Senior Member vincevincevince is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Apr 1, 2003
posts:4847
votes: 0


What could mean a big drop in pages? Hmm... scrapers?

Take a typical scraper with 10 million pages. Dropping just that one site is big enough to show in the figure: 8.17 billion goes to 8.16 billion (assuming US billions).

Take out 10 sites like that and you can see why they might want to hide the number.
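
Rough back-of-the-envelope numbers (mine, not Google's): ten scrapers at 10 million pages each is 100 million URLs, so 8.17 billion drops to about 8.07 billion, a bit over a 1% dip in the headline figure from just ten sites.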

6:58 pm on Sept 27, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:July 27, 2004
posts:138
votes: 0


It would be great if they cleaned up the old supplemental pages. Forget this stupid counting war; we want quality. Some pages got penalized because Google keeps old pages cached.

9:26 pm on Sept 27, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:Apr 28, 2005
posts:49
votes: 0


I thought that count was wildly inaccurate anyhow? It certainly hasn't changed for a long time, in any case.

11:42 pm on Sept 27, 2005 (gmt 0)

Preferred Member

10+ Year Member

joined:June 6, 2005
posts:524
votes: 1


Maybe they will come back with a much larger number, and in the meantime they have taken down the page count that is lower than Yahoo!'s.

I'm thinking when the number goes back up it will be over 20 billion pages.

12:26 am on Sept 28, 2005 (gmt 0)

Senior Member

joined:Dec 29, 2003
posts:5428
votes: 0


NONSENSE!

I have fewer than 1,500 real pages, yet Google's site:www.domain.com count for my site is over 100,000. That is not a joke.

4:15 pm on Sept 29, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member annej is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Feb 17, 2002
posts:3318
votes: 0


All those extra pages aren't much of a problem when you are looking up most topics, but when you are looking up something less usual, you have to wade through so many meaningless pages when some small site may have just the answer. I was looking for an unusual craft product last night and mostly got automatically generated pages.

6:23 am on Sept 30, 2005 (gmt 0)

Preferred Member

10+ Year Member

joined:May 27, 2003
posts:503
votes: 0


When I first read this story a couple of days ago, there was one statement that just screamed out to me that Google isn't concerned about quality in their SERPs (anymore, if ever), but would rather engage in a peeing match with the other engines.

[Marissa] Mayer [Google's director of consumer products] said that since apples-to-apples comparisons are no longer possible, Google decided to stop listing the size of its index and instead invite Web surfers to conduct the equivalent of a "taste test" to see which engine consistently delivers the most results.

Now obviously I can't speak for anyone else, but personally I'm not at all interested in which engine can supply me with the most results; what I want is the best results. Being able to index every scraper site, DMOZ & Wiki clone does not equal quality - especially if that leaves me trying to find the "needle in a haystack."

I've tasted, and it's rancid.

6:39 am on Sept 30, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Feb 6, 2005
posts:1678
votes: 71


balam

>>Now obviously I can't speak for anyone else, but personally I'm not at all interested in which engine can supply me with the most results; what I want is the best results.<<

Well said!

However, it seems that Google is unable to deliver the expected QUALITY SERPs. They have been trying to achieve that, especially during 2005, but the results aren't encouraging at all.

So instead of QUALITY, we see, for example, Mr. Eric Schmidt talking about a "larger index", not "best results"!

I don't wish to sound anti-Google, but that's what I feel sometimes. :(

7:23 am on Sept 30, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:July 20, 2001
posts:74
votes: 0


Google's desire to index as much as possible is counterproductive and is damaging web-based business.

By generating duplicate URLs through modified parameters, it is creating index bloat and suppressing decent websites by imposing its flawed approach to 'duplicate content'.

One of my sites has around 50,000 pages. Google was showing around 400,000. I submitted a revised robots.txt which should have taken out about 95% of the site. Although you can no longer find these pages in their index, Google now shows 415,000 pages for this site!
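
For what it's worth, the sort of robots.txt change I mean looks something like this (the directory names here are invented for the example; mine block the URL variations that were creating the duplicates):

User-agent: *
Disallow: /search/
Disallow: /print/
Disallow: /session/

One caveat I've since read: robots.txt only stops crawling, it doesn't force removal, so Google can keep previously crawled URLs listed as bare URL-only entries, which may be part of why the site: count hasn't dropped.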

5:21 pm on Oct 3, 2005 (gmt 0)

Preferred Member

10+ Year Member

joined:July 26, 2005
posts:486
votes: 0


I have a site with about 6,000 pages and it shows 82,000 in Google. The Google count seems to go up by about 20k pages a week.

As for Google's results, I agree on quality over quantity. If a search engine were able to deliver the 5 most relevant results to what I had in mind, then it wouldn't matter whether there were 100,000 other results or 100,000,000. Relevancy is infinitely more important. When I have had to dig deep through the SERPs to find information, it is only because the relevancy sucks.

7:43 am on Oct 4, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:May 15, 2003
posts:96
votes: 0


What do you guys think: how long will it take for Google to get back to quality results? One month, or much more?

11:43 am on Oct 4, 2005 (gmt 0)

Preferred Member

10+ Year Member

joined:Jan 11, 2005
posts:513
votes: 0


I have to think Google is going to start noticing a dip in traffic volume with consistently poor results.

I read a comment somewhere that the average, typical, non-Webmaster user still loves Google, and that they don't notice any difference in results. I disagree with that. For the better part of a year now, I've had people tell me Google seems to be "hosed up" and that to find good results, they're going elsewhere.

Some are going to Yahoo, some MSN, and a few have even said they are loving Ask Jeeves again.

Personally, I'd love to see more uniform traffic from the SEs, as some of them perform better in certain areas. If overall traffic were more consistent, more attention would be paid to results, because that is what matters, after all.

Bigger is not better when it comes to SERPs. I'm surprised Google doesn't know this, but apparently they don't. That's why you find so much junk outranking the good core pages. Google needs to really do a thorough cleaning, and get rid of all the 404 pages, directory sites, scraper sites, and spammy sites. Then it might be relevant again.