Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

Google's results seem to keep shrinking - 99.9% getting filtered


goodroi

9:01 pm on Sep 29, 2016 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



It is a bit interesting to see how Google's results are decreasing in scope. One example of these shrinking results: if you try to research <cough>cyberstalk ;)</cough> Matt Cutts, you will find Google only returns about 200 results. Google says there are "About 422,000 results", and if you start scrolling through the serps it looks like it goes on & on, but after about 200 results it stops showing matches. 200 out of 422,000 means that 99.9% of the results are being filtered away.
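
The arithmetic above can be checked with a quick sketch (Python), using the figures quoted in the post - the claimed total versus the results you can actually reach before the listings stop:

```python
# Figures quoted above: Google's claimed total vs. results actually
# reachable in the serps before it stops showing matches.
claimed_total = 422_000   # "About 422,000 results"
shown = 200               # roughly where the listings actually end

filtered = 1 - shown / claimed_total
print(f"{filtered:.2%} of the claimed results are never shown")
```

The exact figure comes out at about 99.95%, so "99.9% filtered" is, if anything, slightly generous to Google.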

This isn't exactly new. Google has been working on leaner serps for quite some time. They have played around with fewer results per page and have been lowering the total results shown for a while. I'm bringing this up because I don't see many people noticing this squeeze. For many it might not make a difference, but for me it is a good reminder to keep ensuring that all of my pages are unique/useful/engaging/powerful so I have the best chance of getting past the 99.9% filter.

iamlost

1:21 am on Sep 30, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



There are a couple of interesting moves that have been in play for years:
1. The default query return has long been capped at 1000.
Note: the 'about [ n-number ] results' figure is pure marketing bumph and always has been.
However, the average searcher doesn't go past the third page, and many never leave the first page viewport. Which tells us that at least 1000 - 30 = 970 results (in a perfectly varied return), or 97%, are never seen. Or, the other way around: on average only the top 3% of a query return has a realistic chance of being viewed. For the majority (~75%) of searches the results are even bleaker: 1000 - 5 = 995, or 99.5%, are ignored; only the top 0.5% are clicked.
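
Those percentages can be reproduced with a small sketch, assuming the 1000-result cap and the 30- and 5-result viewing depths used above:

```python
# Share of a capped 1000-result return that is never seen,
# for different viewing depths (figures from the post above).
CAP = 1000

def never_seen(viewed: int, cap: int = CAP) -> float:
    """Fraction of the capped return the searcher never looks at."""
    return (cap - viewed) / cap

print(never_seen(30))  # three pages deep -> 0.97
print(never_seen(5))   # top-5 only       -> 0.995
```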

Of course the SEs know this, so they don't actually load those 1000 returns, only whatever the first page is set to show (default: 10). The searcher has to make a further request for each additional page of results: a built-in barrier. These days results pages are full of diversions ahead of the ability to request those additional answers: ads on top, 'in the news', 'people also ask', 'searches related to [ query ]', etc.
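
The one-request-per-page barrier described above can be illustrated by how the result-page URLs are built. A sketch (the `start` offset parameter is as commonly observed on Google result URLs, not an official API; the query string is illustrative):

```python
from urllib.parse import urlencode

def serp_page_url(query: str, page: int, per_page: int = 10) -> str:
    """Build the URL for one page of results; each page is its own request."""
    params = {"q": query, "start": page * per_page}  # offset into the return
    return "https://www.google.com/search?" + urlencode(params)

# Walking all 100 pages of a capped 1000-result return means
# 100 separate requests - few searchers ever make the second one.
print(serp_page_url("matt cutts", 0))  # first page, start=0
print(serp_page_url("matt cutts", 3))  # fourth page, start=30
```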

And if one takes the time to click through all those pages offered behind the Google o's, one finds that (1) there aren't actually ten pages' worth available despite those 10 o's, and/or (2) repetition of domains shrinks the available resource by at least a third, often two-thirds. Just like that 'about [ n-number ] results' figure, the actual (1000) number is an illusion.

2. Personalisation: from Universal Search onward, personalised filter bubbles have meant that (mostly) no two people see the same results for the same query, even if made within seconds of each other. The more data points available to the SE, the greater the disparity in results. Discovery exists only for a bounded, fleeting moment until sufficient personalisation bindings can be applied.

Between personalisation and conservation of resources (aka limiting return size), general search engines are increasingly high-walled gardens, each with one or a very few people inside. By analogy, many are held to house arrest, some can range their neighbourhood, a few their city... beliefs are reinforced, opinions validated, the past is here, the future unavailable. Besides being #*$! irritating, it is hazardous to the greater good... except for the business good of the gardener.

SEs have largely changed their purpose; the money, after all, is in popularity, not discovery. Increasingly, those (OK, relatively few) interested in discovery are finding new channels and sharing, in behaviour surprisingly similar to that of 20 years ago.

Note: such behaviour is not restricted to SEs. Take a read around what should be a web leader - webdev - and all the popular stuff is years out of date, often rehashed in an attempt at pseudo-relevance; the cutting-edge stuff is largely pooh-poohed, ignored, or misunderstood after a quick scan, and the bleeding edge might as well be science fiction so far as most webdevs are concerned. They reside somewhere between 2005 (pre-smartphone) and 2012 (Penguin) for the most part. One doesn't know whether to laugh hysterically or cry the same.

JesterMagic

10:51 am on Sep 30, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Good summary, iamlost. I think mobile should be added to the list as well, since mobile devices now account for over half of all search queries. With all the small screens and clutter, I think most mobile users barely get past the first page. Heck, most probably don't scroll down. You also have prime spots in the serps taken up by listings for apps nowadays, and usually not just one spot but one each for Android, iTunes, and Amazon.

Also I see more and more people using the Google Search Box (I forget the exact name) on Android phones now, which combines results with other things and further muddies the serps.