Home / Forums Index / Google / Google SEO News and Discussion
Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

    
Google has cut down number of results
Displaying about 600
youfoundjake




msg:3480025
 5:59 pm on Oct 17, 2007 (gmt 0)

Searching for my key phrase, 5 million results found, about 590 displayed.
Searching for SEO, 47 million results found, 570 displayed.
No more 600-999, did I miss something?

 

tedster




msg:3480067
 6:28 pm on Oct 17, 2007 (gmt 0)

This was mentioned a few weeks ago as well, but inside another thread - not as the main topic. Still, since that post I have been pondering what this change might be about.

Yes, I see this kind of change, too. However the search for "SEO" is returning in the 900's for me right now. Still, it's not showing all 1,000. I'm glad you posted about this, because my sense is that this is a visible sign of a significant change on Google's back end, and not just an attempt to limit our access to their data.

Two questions I wonder about:

1. What criteria are used to choose the size limit for that initial result set?

2. Is this tied to some of the "re-ranking" we see, such as in the "-950 penalty"?

For #1, I'm considering that the preliminary result set that a query generates gets truncated at some spot determined by the "relevance scores" of the urls returned initially. Once those scores get too low, then Google won't include the url, even if it does, strictly speaking, contain the search terms.

For #2, (my thinking here is based on a number of patents) it's possible that Google is using more types of re-ranking over the preliminary result set. By truncating the size of the set, they've lowered computational overhead.
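Hypothesis #1 above amounts to a simple score cutoff over a ranked list. A minimal sketch of the idea, with invented scores and an invented cutoff value (Google's actual criteria are unknown):

```python
# A toy sketch of tedster's hypothesis #1: truncate the preliminary
# result set once "relevance scores" fall below some cutoff.
# All scores and the cutoff here are invented for illustration.
def truncate_by_score(scored_urls, cutoff):
    """scored_urls: (url, relevance) pairs, already sorted best-first."""
    out = []
    for url, score in scored_urls:
        if score < cutoff:
            break  # the list is sorted, so everything after scores lower still
        out.append(url)
    return out

prelim = [("u1", 0.92), ("u2", 0.81), ("u3", 0.44), ("u4", 0.12)]
print(truncate_by_score(prelim, cutoff=0.5))  # ['u1', 'u2']
```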

youfoundjake




msg:3480303
 10:54 pm on Oct 17, 2007 (gmt 0)

Odd, ok, here is an example that makes no sense.

I did a search for news
1.1 billion results.
At the 6th page, I always get the lovely "In order to show you the most relevant results, we have omitted some entries very similar to the 530 already displayed.
If you like, you can repeat the search with the omitted results included."
Click on that, and I get pages 1-5.
5th page gives me
Personalized Results 401 - 458 of 458 for news [definition]. (0.17 seconds)

Not only is it truncating the displayed results, it's dropping basically 1.1 billion pages from the count.

tedster




msg:3480354
 12:34 am on Oct 18, 2007 (gmt 0)

For me the "news" search truncates at 905 results and gives me an omitted results link. However, clicking on that link (which essentially adds &filter=0 to the url) then truncates results at 753 results -- 152 fewer. And that's what I get, instead of a list that now folds in the previously omitted results! I'd say that's another sign that the basic infrastructure has shifted. More analysis of how the 905 and the 753 compare might be interesting at some point.

However, Google does not exist to supply lots of data to the likes of you and me. They want to supply information to ordinary end-users, people searching for information. Those folks rarely care about page 19 of the results, so what Google is doing here doesn't affect them. If Google makes changes that let them answer more searches, faster, the end-user will like that.
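For reference, the omitted-results link discussed above essentially re-issues the same query with &filter=0 appended to the URL. A minimal sketch of assembling such a URL (the q, num, and filter parameter names match what appears in real Google search URLs of the era; the helper function itself is hypothetical):

```python
from urllib.parse import urlencode, urlparse, parse_qs

def google_search_url(query, filter_omitted=True, num=100):
    """Build a Google web-search URL; filter=0 asks for omitted results too."""
    params = {"q": query, "num": num}
    if not filter_omitted:
        params["filter"] = 0  # include near-duplicate ("omitted") results
    return "https://www.google.com/search?" + urlencode(params)

print(google_search_url("news", filter_omitted=False))
# https://www.google.com/search?q=news&num=100&filter=0
```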

[edited by: tedster at 4:59 am (utc) on Oct. 18, 2007]

youfoundjake




msg:3480426
 1:57 am on Oct 18, 2007 (gmt 0)

Tedster, what DC are you looking at?
Also, how can I tell what DC I'm looking at? :P
As you brought up before with the -950 penalty, could what we are seeing be the result of that filter's dial being turned WAY up?
Can an update or synch also cause this?

tedster




msg:3480507
 5:10 am on Oct 18, 2007 (gmt 0)

OK - now I see 605 results for "news" and 742 including the omitted results. That's on 64.233.169.103. You can see the IP address of your current web page by using Firefox and installing the ShowIP extension.

Querying an IP directly often gives different results than letting google.com direct you through their load balancing and so forth. In this case, I get 608 initial results (3 more) and the same 742 when I include the omitted results.

No, I don't think this is directly related to the 950 re-ranking being "turned up". Instead, my guess is that a shorter preliminary list makes such re-ranking computationally easier. People with the -950 still see their previous first page results at the end, but that end just comes quicker.

youfoundjake




msg:3482908
 8:26 pm on Oct 20, 2007 (gmt 0)

One thing I'm noticing, along with the drop in displayed results, is that the NUMBER of results is dropping significantly as well. The most drastic phrase I track has gone from almost 19 million results down to 3 million.
I'm wondering if Google is hiding more of the background to prevent manipulation, i.e., working a niche with few results so you can rank higher due to less competition.
Just wondering if anyone else has seen the drop in results for some of their terms.

trinorthlighting




msg:3483091
 12:49 am on Oct 21, 2007 (gmt 0)

They should cut it down to 100, how many users ever go beyond that?

youfoundjake




msg:3493265
 2:47 am on Nov 1, 2007 (gmt 0)

They should cut it down to 100, how many users ever go beyond that?

Pretty close to that now. My search phrase went from peaking at 19 million results to now only 450,000.
There is a lot of pruning going on.

trinorthlighting




msg:3493282
 3:23 am on Nov 1, 2007 (gmt 0)

It will hide the -950 penalty for those who are affected.

tedster




msg:3493292
 3:35 am on Nov 1, 2007 (gmt 0)

I think it will test some theories of the -950 penalty. If the urls concerned used to show in the high 900's for 1,000 results, and they now show in the 600's for 650 results (that's my guess), that will tell us something. And if they don't, then that also will tell us something.

[edited by: tedster at 5:11 am (utc) on Nov. 1, 2007]

potentialgeek




msg:3493306
 3:49 am on Nov 1, 2007 (gmt 0)

They should cut it down to 100, how many users ever go beyond that?

I think Google checked its data for the typical cut-off points by searchers and "cut the fat."

The deepest search I can ever recall from raw logs was about position 720. However, I rarely see past 30. I'm sure a lot of searchers don't even get to page 2 of results, but I'd like Google to share its data.

p/g

Marcia




msg:3493352
 6:02 am on Nov 1, 2007 (gmt 0)

>>You can see the IP address of your current web page by using Firefox and installing the ShowIP extension.

Using FF it's been fine, but in IE I'm seeing a *very* weird result for one search term (pure, unfiltered garbage). It seems to be the data center I'm accessing with IE, consistently, whether or not I'm logged in.

zett




msg:3493386
 7:44 am on Nov 1, 2007 (gmt 0)

Interesting. My stats package automatically looks at the (estimated) page # in Google.

Here's the breakdown:

Search term leading to my site is on...
Page 1: 85.0%
Page 2: 7.1%

Page 3: 3.0%
Page 4: 1.4%
Page 5: 0.8%
Page 6: 0.6%
Page 7: 0.4%
Page 8: 0.2%
Page 9: 0.2%
Page 10: 0.2%
beyond Page 10: 1.1%

My stats also show that just about 8%-9% of the searches come from page 3 or beyond, with the tendency slightly decreasing over the past months. This is perfectly normal behaviour IMO. When using Google, I think the SERPs on page 3 or beyond are of lower quality than the first two pages, so I just don't think it's worth trying a result on those pages.
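The per-page figures above do add up to the quoted "about 8%-9%" for page 3 and beyond; a quick check using the numbers as reported:

```python
# zett's reported share of Google referrals by results page (percent)
page_share = {
    1: 85.0, 2: 7.1, 3: 3.0, 4: 1.4, 5: 0.8,
    6: 0.6, 7: 0.4, 8: 0.2, 9: 0.2, 10: 0.2,
}
beyond_10 = 1.1

# share of referrals arriving from page 3 or deeper
page3_plus = sum(v for p, v in page_share.items() if p >= 3) + beyond_10
print(round(page3_plus, 1))  # 7.9 — consistent with "about 8%-9%"
```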

callivert




msg:3493429
 10:06 am on Nov 1, 2007 (gmt 0)

They should cut it down to 100, how many users ever go beyond that?

I do from time to time. Usually it's because I'm researching a topic, and I want to see a variety of websites that all talk about the same thing.
There are many different purposes for doing a search in Google. It's not always about getting an instant answer.

algarveb




msg:3498603
 11:49 am on Nov 7, 2007 (gmt 0)

This morning we have noticed a massive drop in the number of SERP results available for any given search. For our keywords 'blue widget' and 'widgets in blue' we would normally see 2.2 or 2.7 million results - today both have dropped to less than 350,000.

Is anyone else seeing this and what does it mean?

At the moment, there do not seem to be any significant changes in the SERPs themselves - at least not on pages 1 and 2...

[edited by: tedster at 5:04 pm (utc) on Nov. 7, 2007]
[edit reason] moved from another location [/edit]

youfoundjake




msg:3499077
 9:20 pm on Nov 7, 2007 (gmt 0)

One thing to try is signing out of Google; I did notice that the result count increased when it was not using personalized search. But as it stands right now, I'm sitting at 550,000 either way, after peaking at 19 million.

bwnbwn




msg:3499162
 10:41 pm on Nov 7, 2007 (gmt 0)

I haven't seen this all day on any search I have done.

BradleyT




msg:3499382
 6:41 am on Nov 8, 2007 (gmt 0)

I was showing a guy at work this "trick" just the other day. You used to be able to use it to gauge how tough SEO might be for a particular phrase: 3-4 word phrases would usually show something like 1.2 million results, but at 100 results per page you often wouldn't get anything shown past 200, or 800, or 1,200.

So we tried this with a long term, which we eventually shortened down to just the single word computer - which gives 872 results without omitted ones. Obviously not an accurate figure if you're gauging SEO difficulty. I didn't have an explanation for the guy as to why it was so few, and he probably thought I was an idiot, lol. Good to see a thread about this.

Miamacs




msg:3499618
 2:06 pm on Nov 8, 2007 (gmt 0)

Didn't quite get what this thread is about... (the overall number of results, or the number displayed?), so excuse me if I'm stating the obvious.

No more 600-999, did I miss something?

It's been like this for years on generic, ultra competitive searches.
I don't get it,... did *I* miss something?

1. What criteria are used to choose the size limit for that initial result set?

TrustRank.

OK, that's the method. Criteria are... well... SPAM and aggressive SEO activity. What else. *grin*
Anything that can manipulate relevancy and PageRank but not TrustRank. That's what it's for. See below.

2. Is this tied to some of the "re-ranking" we see, such as in the "-950 penalty"?

It has no connection with the -950 penalty.

The rest... I don't think so... in what way could it be?
It's partially a manual setting though so...

...returning in the 900's for me right now

This threshold is virtually a runtime setting.
It can change several times a day, or be left alone for weeks.
In one area, the generic travel sector, I watch an ultra-competitive phrase that'll show any number of results between 97 (!) and 900-something.

...

The number of results displayed is based on the trust threshold set for a given search. The initial - relevancy + trust + whatnot - set is always much larger of course.

This will be first filtered for sites that don't clear the required parameter ( trust ), and only *then* comes the application of some rerankings.

Thus the list will only show results from *domains* and within them, *URLs* that are trusted.
Sometimes that means sacrificing some more relevant sites that would have ended up at good positions otherwise...

And to newbies reading this, guess what, this is what we've been calling the 'sandbox effect'.

Trust increases as links/sites age, thus sometimes sites with no *new* links suddenly clear the threshold. But it's not a time limit.

The final list will show results from only the most trusted domains... Apply the &filter=0 parameter and you did a site: search on them, but only them.

The top list is always clipped at a point that feels convenient, more often than not because after that position there's some irregular SPAM activity (or too many people reaching the 'pro level' in some other 'unfair' way - Google's interpretation of fair, not mine). It's effective all right. A little too effective in some cases, but most of the time I'm thankful it's there.

Rerankings like the -950 are applied in a very interesting way. A site that would be #1, but is filtered down to -950, will show up at #355 on a SERP that ends at position 355. You know, that's because sites that go -950 (for popular one- or two-word phrases) are, in fact, trusted. As I've been saying for almost a year. But whatever.
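The two-stage model described above - first drop everything below a trust threshold, then apply a "-950"-style reranking that sinks a flagged (but still trusted) URL to the end of whatever list remains - can be sketched like this. The threshold value, trust scores, and penalty flag are all illustrative assumptions, not anything Google has confirmed:

```python
# Toy sketch of "trust filter first, re-rank second":
# a penalized url stays in the trusted set but sinks to the end
# of the clipped list, e.g. #355 on a SERP that ends at 355.
def build_serp(results, trust_threshold, penalized):
    """results: (url, trust) pairs in relevance order; penalized: set of urls."""
    trusted = [url for url, trust in results if trust >= trust_threshold]
    kept = [url for url in trusted if url not in penalized]
    tail = [url for url in trusted if url in penalized]
    return kept + tail  # penalized urls land at the end of the clipped list

results = [("a.com", 0.9), ("b.com", 0.4), ("c.com", 0.7), ("d.com", 0.8)]
serp = build_serp(results, trust_threshold=0.5, penalized={"a.com"})
print(serp)  # ['c.com', 'd.com', 'a.com'] — a.com would be #1, ends up last
```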

...

Or was this all off topic?

[edited by: Miamacs at 2:14 pm (utc) on Nov. 8, 2007]

g1smd




msg:3500203
 12:55 am on Nov 9, 2007 (gmt 0)

Several searches that I did a few days ago, now report the number of results having dropped by some 70% or more.

I assume these are errors in the estimates. This effect has happened before, several times, over the last few years.

I think that at least one of the significant changes in the reported counts, a year or two back, coincided with a major update to Supplemental Results.

steveb




msg:3500249
 2:29 am on Nov 9, 2007 (gmt 0)

It's commonly back to about 1,000 now.
