
Google SEO News and Discussion Forum

Are unstable Google rankings a sign of fundamental Google problems?
aristotle
posted 2:55 pm on Nov 16, 2013 (gmt 0)


System: The following 5 messages were cut out of thread at: http://www.webmasterworld.com/google/4623685.htm [webmasterworld.com] by robert_charlton - 10:33 am on Nov 16, 2013 (PST -8)


Unstable search results, in which rankings repeatedly undergo drastic changes, are a clear sign of fundamental problems in Google's approach. They keep claiming that the results are great, and yet keep making major changes in their ranking algorithm. If the results are so great, why would so many major changes be necessary? They seem to be forever stumbling around in the dark trying to find a solution, but most of the changes only make their results worse.

 

EditorialGuy
posted 3:34 pm on Nov 16, 2013 (gmt 0)

Unstable search results, in which rankings repeatedly undergo drastic changes, are a clear sign of fundamental problems in Google's approach.


Surely nobody (even the folks at Google) thinks Google's search results are perfect. Doesn't it make sense, then, that Google wouldn't freeze its algorithm and filters just for the convenience of site owners who take comfort in fixed rankings?

The Web is fluid. Why shouldn't search results be fluid? Just as important, if users were unhappy with "unstable" search results, wouldn't their dissatisfaction have an impact on Google's share of the search market?

Personal opinion: Google's search results are a long, long way from perfect, and I'd hate to see Google abandon its attempts to make them better.

aristotle
posted 3:56 pm on Nov 16, 2013 (gmt 0)

I've seen many threads on this forum where someone reported that pages on their site had been at or near the top of the SERPs for their main keywords for many years, then suddenly overnight fell hundreds of positions, even though there had been no recent changes to the site. If you go to Google's "official" forum, you will see a lot more of these cases than you do here.

Let me ask a question:
If Google consistently ranks a page number 1 for ten years (for its main keyword), then overnight that page falls to number 600, does it mean that for ten years that page was the best result on the web for that query, but now suddenly there are 599 better results?

Google has been working on their algorithm for about 15 years. Yet there are still occasional major shakeups such as Panda and Penguin. I didn't say they should freeze their search results. But after 15 years, you would think there would be more stability. Yes, there would be shifts, but they would be slow and gradual, not sudden and drastic. Something is fundamentally wrong with their approach.

SevenCubed
posted 4:22 pm on Nov 16, 2013 (gmt 0)

If Google consistently ranks a page number 1 for ten years (for its main keyword), then overnight that page falls to number 600, does it mean that for ten years that page was the best result on the web for that query, but now suddenly there are 599 better results?

Wow. Most excellent comment! That one forced me back in my chair. A fluid web, guided by an evolving algorithm, wouldn't behave like that. That post is one of those rare ones (for me) that stops an argument in its tracks, even if I was on the wrong side of the argument. Thanks so much for that treasure :)

EditorialGuy
posted 4:24 pm on Nov 16, 2013 (gmt 0)

If Google consistently ranks a page number 1 for ten years (for its main keyword), then overnight that page falls to number 600, does it mean that for ten years that page was the best result on the web for that query, but now suddenly there are 599 better results?


It could simply mean that the site has been hit with a penalty. It could also mean that Google's results for [keyword] were lousy for ten years and are beginning to improve, or that conceptual changes such as Hummingbird are making the notion of a page's ranking "for its main keyword" an anachronism.

Most of us can point to examples where lousy results seem to be almost hardcoded into the SERPs--and no, I'm not talking about the likes of Amazon or Wikipedia. I'm talking about junk pages that look completely out of place on the first page of search results. In such cases, I'd like to see more flux, not less.

londrum
posted 8:32 pm on Nov 16, 2013 (gmt 0)

If Google consistently ranks a page number 1 for ten years (for its main keyword), then overnight that page falls to number 600, does it mean that for ten years that page was the best result on the web for that query, but now suddenly there are 599 better results?

Best for who? Imagine if ten people searched for a relatively simple phrase like "red widgets". What is best for one person might not be best for another. They could all be looking for slightly different things (prices? info? pictures? reviews?).
So it's possible that there are many "best" sites for each query. I think it makes sense to return different sites at no. 1 (but dropping them 600 places would seem a bit over the top!)

aristotle
posted 9:42 pm on Nov 16, 2013 (gmt 0)

best for who?

The question referred to Google's "general" rankings which presumably reflect their algorithm's assessment of what's best for the typical "average" searcher. Talking about different search results for different people doesn't really relate to the basic question of the thread.

austtr
posted 11:03 pm on Nov 16, 2013 (gmt 0)

IMO Google's continual changing of the way that search results are derived is driven by three fundamental issues:

1) The need to increase profits, maintain their share price and keep Wall St happy. As a publicly listed company, they have no choice in this.... how they do it is a whole other topic.

2) The need to keep ahead of the "bad guys" who see Google as a money pit and will use any trick that works to their advantage. Google will do whatever it takes, collateral damage be damned, to prevent this, and regular SERP upheaval is one way to do that. The skewing of results to favour big brands was another way of locking out spam sites.

3) The Google of today only pays lip service to search integrity. They are an online advertising platform and have been for some time. The real value in the SERPs is to provide screen real estate that carries Google's products and services. Organic results have been made largely irrelevant as they are pushed further and further from public view.

kawen
posted 10:26 am on Nov 17, 2013 (gmt 0)

AdWords.

NegSeoExists0
posted 2:45 pm on Nov 17, 2013 (gmt 0)

Very good point.

It has been top quality for 10 years and suddenly drops 600 positions because it gets penalized?

Wow, this is so wrong.

And the answer is AdWords, as kawen mentioned.

EditorialGuy
posted 4:01 pm on Nov 17, 2013 (gmt 0)

And the answer is AdWords, as kawen mentioned.


Nope. That argument would make sense only if competing pages didn't gain free traffic by moving up in the SERPs, thereby reducing their need for AdWords.

np2003
posted 5:32 pm on Nov 17, 2013 (gmt 0)

Google's web spam team has been on a crusade to murder the SEO industry, and as a result, unstable rankings are now common. It's not very hard to take down a competitor's site these days. Just buy a few sponsored WordPress themes with their keywords in the footer and give them away for free, or spend $50 on Fiverr for thousands of XRumer links, and within 6 months the site is TOAST (won't be in the first 20 pages, guaranteed). This works every time. Google knows this, but to them it's small collateral damage.

NegSeoExists0
posted 6:28 pm on Nov 17, 2013 (gmt 0)

@EditorialGuy That would be true if the ranking sites weren't spam and would never, ever spend money on AdWords.

You see my point? Those getting the free traffic are, in most cases, already spam, churn-and-burn sites that wouldn't spend money on AdWords.

martinibuster
posted 6:44 pm on Nov 17, 2013 (gmt 0)

Google rankings are stable
This discussion is faulty because the underlying assumption that the rankings are unstable is incorrect. The fact is that 99% of rankings are unchanged during most algorithmic changes. Go back and read the announcements. Most of the algo announcements state that the changes affect a single-digit percentage of searches.

There are problems but not instability
I'm not saying Google's SERPs are the apex of search. Even Googlers are aware that improvements are necessary. If anything, my personal opinion is that there are problems with Google's search results, but instability of rankings is not one of them.

aristotle
posted 9:24 pm on Nov 17, 2013 (gmt 0)

Well, I don't have time right now to look up the percentage changes that Google announced during major rollouts over the past couple of years, but I'm pretty sure that some of them were considerably larger than 1%. Besides, I don't think anyone ever figured out how Google measures those percentages, especially since the effects often seem far larger to non-Google observers. In fact, many were large enough that people here on this forum noticed them and reported them before Google announced them. Google itself said some of them were major innovations to the algorithm, gave them special names, and bragged about how much the SERPs were improved.

There's a lot more to this that could be said, but I just don't have time right now, so I will try to get back to it tomorrow.

NegSeoExists0
posted 10:20 pm on Nov 17, 2013 (gmt 0)

@aristotle You are completely right.

If sites in the Alexa top 500k are getting affected, that is a huge percentage.

martinibuster
posted 11:11 pm on Nov 17, 2013 (gmt 0)

SearchEngineWatch on Penguin 2.1 [searchenginewatch.com], Oct 4, 2013

...according to a tweet from Google's Distinguished Engineer Matt Cutts.

The first update to the second-generation Penguin algorithm designed to target web spam will affect "~1% of searches to a noticeable degree."

Google Penguin 2.0 went live on May 22 and affected 2.3 percent of English-U.S. queries.


Upon the original release of Penguin, Google's official blog stated:

The change will go live for all languages at the same time. For context, the initial Panda change affected about 12% of queries to a significant degree; this algorithm affects about 3.1% of queries in English to a degree that a regular user might notice.


So those are the facts. Panda affected 12% of queries "to a significant degree," which means that whatever else was affected was affected to an insignificant or zero degree. In other words, even at its most significant effect, 88% of the SERPs stayed the same. In the subsequent updates quoted above, between roughly 96.9% and 99% of the SERPs remained the same.

Those are the facts. The premise of this discussion is not established. Google's rankings are not unstable. I think it's important for the facts to be known.
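
As a purely illustrative aside, the arithmetic behind that stability argument, using only the percentages quoted above, looks like this in Python (a minimal sketch, not anyone's actual measurement method):

# "Affected" in these announcements means affected to a noticeable or
# significant degree, so the remainder is the share of queries that
# did not noticeably change.
updates = [
    ("Panda (initial)", 12.0),
    ("Penguin 1.0", 3.1),
    ("Penguin 2.0", 2.3),
    ("Penguin 2.1", 1.0),
]

for name, affected_pct in updates:
    unchanged_pct = 100.0 - affected_pct
    print(f"{name}: ~{affected_pct}% of queries affected, ~{unchanged_pct}% not noticeably changed")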

aristotle
posted 12:42 am on Nov 18, 2013 (gmt 0)

martinibuster
Well, everyone is entitled to make their own interpretations and form their own opinions. As I said before, Google itself made a big deal about several of the new alterations and additions to their algorithm, especially Panda and Penguin, and bragged about how much the SERPs were improved. Why would they make such a big deal about it unless they thought the effects were significant?

As for the percentages, there are a lot of ways they could be calculated, and as far as I know, Google has never said how they do it. So there's no way to really know what 1% or 3% or 12% means.

But even 3% seems like a pretty big change to me. As I said earlier, stable rankings undergo SLOW, GRADUAL changes and evolve slowly over time. A 3% change in one day is not slow and gradual; it's sudden and sharp. That's my opinion anyway, maybe yours is different.

But I'm going to abandon this thread at this point. I wanted to discuss what I think is a fundamental problem with Google's overall approach to their SERPs, but the thread has gone in another direction that doesn't much interest me. Maybe I'll get a chance to talk about that some other time.

martinibuster
posted 1:22 am on Nov 18, 2013 (gmt 0)

This discussion is not over yet. Here's a rebuttal to my post.

Percentages don't tell the whole story
IF the majority of the algorithm changes affects commercial search queries, then the percentage of sites affected within the commercial group might be vastly higher. Thus, from the point of view of commercial sites, the SERPs might be viewed as unstable.

aakk9999
posted 3:24 am on Nov 18, 2013 (gmt 0)

True, we do not know how the percentages are calculated.

Google themselves say that 15% of queries submitted have never been seen before [google.com...], so it is obviously not possible to measure algo changes against those queries. Hence the percentage of change given by Google could apply to no more than 85% of searches.

But given the sheer volume of daily searches, I think it is impossible to calculate the percentage of change with any precision. I suspect these percentages are extrapolated from search control sets that Google uses to test algo changes. I also suspect that when calculating them, Google compares the first x results before and after rather than the whole result set for each query.

An additional complexity in calculating percentage change is personalisation and localisation. Personalisation cannot be taken into account when calculating these percentages, but localisation may be; if it is, that adds further complexity and a mass of data.
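
To make that suspicion concrete, here is a minimal, hypothetical Python sketch of how a before/after comparison over a control set of queries might be counted. Nothing here reflects Google's actual method; the function name and thresholds are invented purely for illustration.

def share_of_queries_affected(before, after, top_n=10, min_dropped=2):
    """before/after map each query to an ordered list of result URLs."""
    affected = 0
    for query, old_results in before.items():
        old_top = old_results[:top_n]
        new_top = after.get(query, [])[:top_n]
        # count how many results fell out of the old top N entirely
        dropped = len(set(old_top) - set(new_top))
        if dropped >= min_dropped:
            affected += 1
    return affected / len(before)

# Usage with made-up data: if 31 of 1,000 sampled queries cross the threshold,
# this returns 0.031, i.e. "about 3.1% of queries noticeably affected".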

Has anyone thought that this constant tweaking of the algo may be because Google is slowly losing the battle against the speed at which the web is growing?

martinibuster
posted 3:55 am on Nov 18, 2013 (gmt 0)

this constant tweaking of the algo


Seems to me that the tweaking has at least three foci:
1. Understanding user intent (relevance)

2. Understanding concepts (relevance)

3. Fighting spam

We hear a lot about the spam fighting. Not so much about the relevance tweaking.

So, Panda/Penguin fight spam. Fighting spam does not make the SERPs any better. Right?

austtr
posted 6:29 am on Nov 18, 2013 (gmt 0)

the tweaking has at least three foci:


I'd suggest that the move to "authority" might outweigh the others. Gifting top positions to big brands does not make for more relevant results, or a better interpretation of concept, it just locks in a lack of diversity. (There was a recent thread on just that subject)

Fighting spam does not make the SERPs any better. Right?


Spam is whatever Google deems it to be on any given day. One day it's "poor quality" pages that needed a Panda to remove them, then it's "unnatural link patterns/profiles" that need a Penguin to return order to the world, or maybe it's sites that dare to link to affiliate pages without a rel="nofollow"... or some other piffling infringement that sees sites condemned to a post-Penguin oblivion.

I suspect that Google is under such heavy bombardment from spammers that they see themselves at war, and as with most wars, it's the non-participants who suffer the most losses. It's called collateral damage, and I think the fact that Google was prepared to accept such a groundswell of ill-will post-Penguin gives us an insight into just how down and dirty this battle has become.

piatkow
posted 11:35 am on Nov 18, 2013 (gmt 0)


The fact is that 99% of rankings are unchanged during most algorithmic changes.

And the other 1% make all the noise on sites like this whenever it happens. (insert your own preferred percentage from other posts). Most of the rest probably don't even realise that Panda et al ever happened.

JD_Toims
posted 4:09 pm on Nov 18, 2013 (gmt 0)

Percentages don't tell the whole story
IF the majority of the algorithm changes affects commercial search queries, then the percentage of sites affected within the commercial group might be vastly higher. Thus, from the point of view of commercial sites, the SERPs might be viewed as unstable.

Also, we don't know how much the SERPs affected by each change overlap, meaning it could be two consecutive changes applied to a single "niche" or "query type", but otoh, although highly unlikely imo, there may be zero overlap between any two consecutive "minor" changes.

If we look at what would happen with 500+ algo updates a year and non-overlapping changes [Meaning: Change B does not affect the same SERPs as Change A; Change C does not affect the same SERPs as Change A or B; etc.] and use 0.5% of SERPs affected to a significant degree as an "average", we get:

1.37 ave. changes per day * 0.5% of SERPs affected = 0.68% of SERPs change per day

0.68% * 10 = 6.8% of SERPs change every 10 days

Approx. every 17 days and 15 hours we see a 12% overall change (Panda Size)

Every 147 days there's a 100% change to the SERPs

I know I'm using "simple math" and not accounting for the 15% of queries being new in any way, but the point is that without knowing the overlap of the changes, or the exact percentages, we can't determine overall stability from the info we have, because the entire index could [in theory at least] change in less than 150 days, even with each update/adjustment only affecting 0.5% of the SERPs.
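
A minimal Python sketch of that "simple math", under the same assumptions stated above (500 updates a year, 0.5% of SERPs affected per update, zero overlap between changes). The post rounds the daily figure to 0.68% before the later steps, which is why it lands on "17 days and 15 hours" and "every 147 days":

UPDATES_PER_YEAR = 500
AFFECTED_PER_UPDATE = 0.005  # 0.5% of SERPs per update (assumed average)

updates_per_day = UPDATES_PER_YEAR / 365  # ~1.37 updates per day
serps_changed_per_day = updates_per_day * AFFECTED_PER_UPDATE  # ~0.685%

print(f"SERPs changed per day: {serps_changed_per_day:.2%}")  # ~0.68%
print(f"SERPs changed in 10 days: {10 * serps_changed_per_day:.1%}")  # ~6.8%
print(f"Days to a Panda-sized 12% change: {0.12 / serps_changed_per_day:.1f}")  # ~17.5
print(f"Days to 100% turnover: {1.00 / serps_changed_per_day:.0f}")  # ~146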

EditorialGuy
posted 4:35 pm on Nov 18, 2013 (gmt 0)

Let's look at "instability" from a user's point of view.

From day to day or even from minute to minute, Google's order of results for [keyphrase] may vary slightly. Yesterday, John Doe's page was in position 4, and Jane Buck's page was in position 5. Today, their positions are reversed. Does that mean Google's search results are "unstable?" Even if it does, how likely am I to notice, unless I'm a site owner who's checking to see where I rank?

The only time "instability" is likely to matter (from a user's perspective) is when an expected result is nowhere to be found. Knock a Wikipedia or Amazon result from the top 10, and large numbers of people will be irritated. Banish Nick Nobody's result from the top 10, and nobody but Nick is likely to know or care.

Shepherd
posted 4:52 pm on Nov 18, 2013 (gmt 0)

and nobody but Nick is likely to know or care.


That's a great point that too many forget, I think.


expected result


That's awesome! Here's a challenge: next time we ask ourselves "how do I rank better?", let's instead ask "how does our site become an expected result?". That's gold right there.
Step 1. Get people's attention outside of google.
Step 2. Repeat step 1.

a. brand
b. authority
c. marketing
d. social

a+b+c+d = expected result

If people know you outside of google, they're more likely to expect to find you on google.

How many people are going to search for something today, see an Amazon result and say "hmm, what's this Amazon thing?"?

rish3
posted 7:13 pm on Nov 18, 2013 (gmt 0)

Matt Cutts' definition of "affected" is pretty squishy. He's indicated before that it only applies to results that are above the fold, and further hinted that "above the fold" means only the first 5 results or fewer.

He also never specifies whether it's a percentage of "all queries" or of "all unique queries". I suspect he means "all unique queries", which would understate the impact on highly popular queries.
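
A toy Python example of why that distinction matters, with completely made-up numbers, just to illustrate the gap between "share of unique queries" and "share of total query volume":

queries = {
    # query: (daily search volume, affected by the update?)
    "popular query": (1_000_000, True),
    "mid-tail query": (10_000, False),
    "long-tail query 1": (50, False),
    "long-tail query 2": (40, False),
}

unique_share = sum(1 for _, hit in queries.values() if hit) / len(queries)
volume_share = (sum(vol for vol, hit in queries.values() if hit)
                / sum(vol for vol, _ in queries.values()))

print(f"Share of unique queries affected: {unique_share:.0%}")  # 25%
print(f"Share of total query volume affected: {volume_share:.0%}")  # ~99%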

JD_Toims
posted 9:22 am on Nov 19, 2013 (gmt 0)

I think "thing number 4387" to remember is a while ago [back around when the crowding issue was more of a "hot topic"] there was an update that made "non-clicked" results less likely to show for subsequent queries/pages-in-SERPs during a session, so some of the "flux" people are reporting could be nothing more than "personalization based on result click/non-click patterns".

BTW: If people are really "buying into" the theory clearing your cache, cookies and searching through proxies "undo" every type of personalization possible, when location via any IP [even a proxy] and browser-sniffing [something like 85% were consistently identifiable in some fairly wide-scale testing a few years ago], plus query pattern/distance allow for some type of personalization, and even browser-sniffing can be 85% accurate in associating your query with "you" [or someone using your computer], then, please do yourself a favor and stop -- And, imo, it's fallacy to think the "promoted solution" of changing browsers and searching via proxy shows you the "real results", because the "real results" are what "real people" see and "real people" don't do those things, so the "real results" are always personalized to some extent, which means "having completely non-personalized results", doesn't tell you anything more than someone who likely hasn't ever searched on Google before is going to see, but that's not what anywhere near a majority of people see when Google has +60% market share *and* can personalize for most people *and* people do switch back-and-forth between search engines to some extent, meaning I think it's entirely possible imo 75% to 90% of Google's results are personalized in some way in the US where they only have a bit over 60% market share -- Oh, and if changing browsers and proxy IP could show you "real results" to see it for more than one query, you'd have switch to a *new* browser and *new* IP to ensure you weren't getting personalized results after every single query, so even if it did work you'd likely run out of browsers to check with before you made it through a day and almost certainly before you made it through a week, and even if you could get by all that through some "anonymising service" you still have to switch that service for every single query to not have personalized results based on query distance and click patterns.

And let's not even get into things like testing, hitting different data centers, or day parting and what time of day the queries people are checking are made to determine whether the results are "stable" or not.

martinibuster
posted 12:26 pm on Nov 19, 2013 (gmt 0)

Wow, nice points JD_Toims! :)

I think you might have convinced me just how unstable the rankings really are! ;)

ken_b
posted 2:19 pm on Nov 19, 2013 (gmt 0)

I've been following this thread thinking the SERPs aren't really so much unstable as they are variable.

I think JD_Toims' post illustrates the point, for me anyhow, of how and why the SERPs will vary among a pool of searchers using the same query. Sort of an expanded view of personalization.
