Forum Moderators: Robert Charlton & goodroi


Google Backs Out of their Position #6 Mistake

         

tedster

8:42 pm on Jan 29, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Well, now it's official. Back in December, Google made a mistake that sent many #1 rankings to position #6. The first reports here [webmasterworld.com] were greeted with some skepticism around the web, and Matt Cutts commented that he "was not aware" of anything in the algo that would create this effect.

As reported by rustybrick below, Matt Cutts and Google now do understand what happened and they are backing out of it, rolling out the changes through their many data centers.

And this brings up a bigger - and for me, even better - question. What were they trying to do? Is Google now continuing to do it, but without "the mistake"? How much "forced position" work does Google do in the top positions, especially now that Universal Search is the rule of the day?

We begin with observations from our community, as the signs of the rollback began to appear a few days ago:

[edited by: tedster at 9:00 pm (utc) on Jan. 29, 2008]

steveb

7:14 am on Jan 30, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Being deliberately knocked down one spot is to be penalized. It's just a mild penalty. This obviously wasn't a filter since nothing was removed. Pages were moved down, not out.

Since it was a "mistake" though, penalty isn't the right word any more than dropping a glass onto a tile floor is "penalizing" the glass. Penalties are deliberate.

As bad stuff goes though, this was almost a joke from the get-go, but considering there are some terms that are worth $10k+ a day for #1 and maybe $4k a day for #2, any amount of "penalization" can matter a lot.

Google first jumbled up searches that were prone to universal search results last summer, and a different sort of jumble could have happened here. Often for certain terms the universal news result is pinned at a certain spot (like #4 or #10). This placement of widget sellers at #6 could be a haywire attempt to put a froogle-ish result at #6 for widget-y terms.

"business can expect that when I've given it to them for several years"

Only if they are complete doofuses. In this day and age, no one can assume anything with Google. It simply has too many flaws to be thought of in such a reliable way.

tedster

7:21 am on Jan 30, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



a different sort of jumble could have happened here

An excellent name - the "position #6 jumble". It's not a penalty, and it's not a filter - it's really a "jumble". I like it!

Robert Charlton

8:26 am on Jan 30, 2008 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Universal Search entries now appear on several of these searches where there were none before.

Almost parallel with the rollback on this #6 ranking shift (or whatever you want to call it), there have been some major new manifestations of Universal Search. I've continued to see those news stories in position #4, and we've got those local map results [webmasterworld.com] up at the top, which seem to shift in response to user intent and the degree of disambiguation, and not always in the direction you'd expect....

I'm also noting on some of the travel results that when you add modifiers to certain searches, Google will display several organic results, usually three, above the map and list of ten.

On one search I tried, though, searching without the modifier brought three results for authoritative guides to the subject up above the map... and adding the modifier apparently disambiguated the search and a map came up to the top.

I've long thought that Google has different thresholds for different parts of its algo (like Hilltop) to kick in, based on how competitive a search might be. But "how competitive" is a vague term, which might relate to search volume, or the number of pages satisfying a query, or the degree of optimization, or all of these. In general, though, the more specific a search, the less competitive it was assumed to be.

In the local example I mentioned above, adding a modifier shifted the maps up and the organic results down.

As Marcia noted with regard to the "related to" suggestions she sees....

I think a guess that this was something phrase-based wouldn't be too far off the mark, and neither would a suspicion that there's "query expansion" based on word sense disambiguation be without foundation.

Bingo! I don't think it stops just at the "related to" links at the bottom, though. I think it may also be entering into the order of organic results.

One of the strange things that I've seen in this #6 ranking shift is that a page that worked its way up in the rankings to #1 for [big red widgets], and then also eventually came to rank #1 for [red widgets], would drop down for the three-word phrase, but still stay up at the top for the two-word phrase. You could micro-analyze inbound anchor text and onpage factors, but this to me suggests something phrase-related might be going on.

Position 1 has not been recovered in every case. I see #4 relatively commonly, or bouncing between #1 and #4.

I'm also seeing some of the prior #1 positions dropping to organic #4, so they appear right below those nice news story image thumbnails (which are adding an extra result above organic #4). Seems that the #4 position is one Google is looking at, both with regard to news and with regard to local maps.

vincevincevince

8:49 am on Jan 30, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I think it is now safe to assume that for many keywords, positions 1 to 5 are subject to a barrage of extra tests. The fact that anything could go so precisely to 6 shows a strongly ranking site which failed one of the new tests for 1-5. Sites didn't go to 7, 8 or 9. Just to 6.

Miamacs

3:23 pm on Jan 30, 2008 (gmt 0)

10+ Year Member



No time to write another lengthy post, so I'll be quoting stuff instead.

Warning, some serious jumping to conclusions here, although I won't claim I reached one. heh, for all I know this is but an idea, and it might be WAY off, so consider the post just trying to stimulate the thread *grin*

some of the below stuff is evident I guess.

Remember this on SearchengineLand? An Insider's View of Google Universal Search. [searchengineland.com]

'k, read it again. Or just jump to that nice yellow chart.

Ask yourself ( especially knowing the screenshots of the AdWords application [blogoscoped.com] ) whether they also have 'verticals' for themes ( categories ) which they *could* combine using a variety of different emphases when assembling the 'web search' results. When you've stopped guessing, just visit this page [google.com]. Note: this is for categorizing AdSense publishers' pages for AdWords users - to target by theme. Nowhere did I find the slightest official indication of them using it on web search... but...

So they have a not so clear user intent with a phrase that's very competitive ( which in my book means 'competitive in AdWords' otherwise why would Google care one tiny bit ). They have the technology to tell which verticals it would fit ( sure they do, they have theme based targeting in AdWords/AdSense ). They can also tell what the CTR is for different verticals... ie. not the popularity of single URLs but rather... popularity of entire themes/sub categories. They can also see what the most common 'second' searches are and what additional words to the phrase would define which intent. ( remember the AOL leak? people start from broad to narrow - not that we didn't know that ).

They also have search volumes, trends...

...and now they have implemented Universal Search on these SERPs.

You know, the Universal Search that was MEANT to combine different verticals based on 'likely user intent'.

...

Thank you for your time,
I hope I've made at least some of you people confused...

This is what I said [webmasterworld.com] on the original thread about this probably being phrase based.

IF ( all-caps ) this is the case, AND ( all-caps ) your site can be categorized into multiple themes, ie. verticals... adding words that ought to be present along the 'big money phrase' can make you compete in the universal search environment.

On the other hand, if a site is ONE THEME only for this kinda categorization, and when user intent is unlikely to be commercial ( there's another aspect to who has what intent with re-ranking retail sites but *cough* uhm... we won't go there ), let's assume Google didn't want to have pages with certain themes up at the top 3 / top 5 because MOST people weren't looking for them. I mean this would translate as "those who did look for them will find them as long as they're top 10 anyway". It's only natural that well-SEO'd sites ( with ROI in mind ) will be higher in the SERPs, even if they discuss a less broad aspect of a generic theme, right?

Say, for 'Skiing', the most common clicked results might fall into the vertical 'vacation destinations'. Let's use the impossible scenario that Marcia at one point had to optimize a skiing site and as expected, due to the above mentioned circumstances *smirk* the pages lack depth, they fail to discuss things that are usually associated with skiing. However Marcia does a great job in building links, and optimizing the page for 'skiing'. IF Universal Search touches the user intent factor, the site that was previously #1 for skiing could fall in rankings, even though CTR and bounce rates are OK, simply because Google finds the page lacking, compared to the word sets associated with the most popular themes. meaning they feel their users are missing info they were eager to find ( reading their minds as this is but a guess based on their overly generic query + AdWords data ), and make Marcia's skiing site slip, probably even below the Universal Search box... down to at least #4.

again, this is not some conclusion just 'where I'm at' with my guesswork here.

But lemme say that in my experience trust thresholds ( what newbies still call the sandbox ) are tied to AdWords competition and not search volume. The two are of course almost parallel.

...

*hehee*
[tinfoil hat off]

... was just playing with the thought. didn't mean it to take this long...

No time to write another lengthly post

*ack*

*runs off*

potentialgeek

5:10 pm on Jan 30, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Kudos to you, Tedster, for calling it (and continuing to believe) when there were many skeptics.

Now if only they'll admit they also made a mistake with the 950 penalty...!

On a side note, the Plex has also admitted mistakes in recent months with Adsense.

Too many big mistakes going on for a $500+/share corp!

Some of these top-level execs are in over their heads.

p/g

potentialgeek

5:12 pm on Jan 30, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



> I think it is now safe to assume that for many keywords, positions 1 to 5 are subject to a barrage of extra tests.

Not a bad idea in principle.

p/g

europeforvisitors

5:15 pm on Jan 30, 2008 (gmt 0)



Too many big mistakes going on for a $500+/share corp!

Some of these top-level execs are in over their heads.

Call me a skeptic, but I doubt if the top-level execs are personally tweaking filters or fine-tuning algorithms.

marketingmagic

5:50 pm on Jan 30, 2008 (gmt 0)

10+ Year Member



man - too much blah blah in this thread - how about something helpful here? It's a waste of time to read through all the back and forth snipes, save that for personal email exchanges!

I want to read about possible theories, what others are seeing, etc... in a helpful positive manner. Stop it guys!

tedster

8:08 pm on Jan 30, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



calling it (and continuing to believe) when there were many skeptics

Yesterday I had a chance to look at another SEO's collection of data for rankings that changed in December. There was a MAJOR clump that had all gone to position #6 - far beyond expected statistical deviation.

Some of these tracked query terms were relatively far down the long tail. But even there, the statistics were very clear. These stats looked at many hundreds of key search terms, taken from a sizable number of domains. Over 60% of the terms that went to lower positions in December went precisely to position #6.

It will be interesting to see how rankings for those same queries migrate now.
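The clustering tedster describes is easy to check mechanically. A minimal sketch, with invented sample data (the real dataset isn't public), of computing the share of demoted terms that landed at exactly one position:

```python
from collections import Counter

def share_demoted_to(rank_changes, position=6):
    """Fraction of demoted terms whose new rank equals `position`.

    `rank_changes` maps term -> (old_rank, new_rank); a term counts
    as demoted when its new rank is numerically higher (worse).
    """
    demoted = [new for old, new in rank_changes.values() if new > old]
    if not demoted:
        return 0.0
    return Counter(demoted)[position] / len(demoted)

# Hypothetical tracking data - names and ranks are made up.
changes = {
    "red widgets":     (1, 6),
    "blue widgets":    (2, 6),
    "widget service":  (1, 6),
    "widget repair":   (3, 9),
    "green widgets":   (1, 1),  # unchanged, not demoted
}
print(share_demoted_to(changes))  # 3 of 4 demoted terms went to #6
```

If drops were scattered uniformly across positions, no single position would capture anything close to the 60%+ figure reported above, which is what made the #6 clump stand out.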

incrediBILL

8:50 pm on Jan 30, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I want to read about possible theories

Since Google's doing a rollback, it's not worth worrying about unless it happens again in the future. Aaron Wall posted some potential changes that seemed to make a difference; we just don't know which ones did the trick...

trinorthlighting

9:12 pm on Jan 30, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Miamacs hit the nail right on the head; make your site "universal" and get some depth to your sites. Do not have your content written just for a keyword; have it written to support other keywords as well that support your keyword. Some call it phrase-based, some call it searcher behavior ( what people are looking for ), and if you cover all the criteria for those "universal phrases" you will fare very well.

We did a shotgun blast as well and all the new content and 750 new pages pulled us right out. Fortunately I have a large staff that can crank out some web pages when needed. December 26th Tedster announced it, I read it and identified our sites were hit. Fortunately, it is a slow time of the year and I had the majority of my staff working. We peeled off a new site we were building and worked on adding new content on the phrases that were affected. By the time I announced our findings we had about 500 new web pages live, crawled and even cached.

Can I tell you what we did exactly to pull out? No, but I can tell you all the content we added did pull us out and made our site a bit more "universal"

vincevincevince

11:58 pm on Jan 30, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Since Google's doing a rollback

If I were a gambling man, I'd put my money on them not doing a rollback but rather making the edges a bit more fuzzy so it's harder to see. This was done for a reason and I cannot imagine that the reason has disappeared just because the means of treatment got noticed. What can be learnt from this in its raw and unmasked state will be just as valuable after the 'roll back'.

trakkerguy

1:06 am on Jan 31, 2008 (gmt 0)

10+ Year Member



Excellent points Vince!

Miamacs

1:13 am on Jan 31, 2008 (gmt 0)

10+ Year Member



not doing a rollback but rather making the edges a bit more fuzzy

exactly my thoughts. The script was buggy because it sent everything to the same position, the #6 penalty box. ( yeah, sorry 'filterbox' just didn't sound right )

Who would have noticed this script if its effects hadn't been uniform?

They admitted their mistake... of letting us know.

...

steveb

2:48 am on Jan 31, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Google hasn't done a rollback of anything in like five years. They change the sauce every day.

Why, when there is absolutely zero evidence to support the idea, would someone assert "this was done for a reason"? Why are the FUD-powered black helicopters warming up now?

Why, despite the mountains of evidence to the contrary, do people think Google NEVER EVER makes a mistake and that this buggy, rumbling-bumbling search engine is exactly precisely the thing they want it to be?

vincevincevince

3:39 am on Jan 31, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



would someone assert "this was done for a reason"?

Why would a company which depends upon the algorithm make changes to it without any reason to do so? The changes they make might not always help but there is certainly a reason behind it.

incrediBILL

3:58 am on Jan 31, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The changes they make might not always help but there is certainly a reason behind it.

Define help.

The changes might hurt some webmasters while at the same time helping other webmasters that moved up, so depending on who you asked it certainly helped someone except those whining they got moved down.

steveb

4:15 am on Jan 31, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"Why would a company which depends upon the algorithm make changes to it without any reason to do so?"

C'mon. Why did Google change its algorithm last year to allow all those hacked edu domains to rise in the rankings?

Obviously they did not. The idea that actions they take NEVER have unintended consequences is just not sensible.

vincevincevince

4:33 am on Jan 31, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The key is to look at the unintended consequence. A certain subset of sites moved to position six. From this we learn that whatever the intended consequences were, they involved either position six or (in my opinion more probably) positions one to five. Poor implementation of a change to the algorithm is unlikely to result in such a specific change unless that specific position figures in the implementation at some level.

I believe that what we've seen is a second 'due diligence' test. Essentially a mini-algorithm to provide a second check on certain results.

Pass one: find result set sites, using the standard algorithm:

  • result ranking score = [current algo], sort high to low

Pass two: for each of the first five results, apply a due diligence test:

  • due diligence score = [due diligence algo]*a, require result > threshold

Where a result in the first five fails to meet the due diligence threshold, that site is demoted out of results 1-5.

The result is that results 1-5 have passed due diligence, and result 6 might well be a previously 1-5 ranking site which failed due diligence and is now demoted directly to slot 6. I guess that if two sites failed due diligence on one term, then one would be #6 and the next #7.

As for what the due diligence test would be: I've not seen enough examples of affected sites to come up with a good answer. I would hazard that it is focused on higher-level checks for spam and asks the question "did this site get this ranking by artificially manipulating the results?". Manually reviewed sites may be excepted.

This new 'feature' cannot be seen as easily now and may actually be rolled back; however, I'm sure it'll be revisiting us at some point in the near future in a modified form which is not so easy to see - most likely through the 1-5 range being more variable (e.g. 1-10 for competitive terms, 1-8 for less competitive, 1-5 for non-commercial, 1-3 for low-competition...)
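The two-pass idea above can be sketched in code. Everything here is hypothetical - the scoring function, the threshold, and the backfill behavior are invented for illustration, not confirmed Google behavior:

```python
def rerank_with_due_diligence(results, dd_score, threshold=0.5):
    """Hypothetical sketch of the two-pass 'due diligence' re-ranking
    speculated above.

    `results` is a list already ordered by the standard algorithm
    (pass one).  `dd_score` is an invented second check applied only
    to the top five (pass two).  Pages that fail are demoted out of
    positions 1-5; lower-ranked pages backfill the vacated slots, and
    the failures land immediately after - one failure goes to #6,
    a second to #7, and so on.
    """
    top5, rest = results[:5], results[5:]
    passed = [r for r in top5 if dd_score(r) > threshold]
    failed = [r for r in top5 if dd_score(r) <= threshold]
    n_fill = 5 - len(passed)  # slots vacated by demoted pages
    return passed + rest[:n_fill] + failed + rest[n_fill:]

# Ten results ranked 1..10; only the old #1 fails the second check.
ranked = list(range(1, 11))
out = rerank_with_due_diligence(ranked, lambda r: 0.0 if r == 1 else 1.0)
print(out)  # the old #1 now sits at position 6
```

One design point worth noting: the demotion target of exactly #6 falls out naturally from this structure, which is why a bug in such a second pass could plausibly send so many former #1s to the same slot.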

steveb

8:01 am on Jan 31, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



    "I guess that if two sites failed due diligence on one term..."

    Falls apart right there, but more to the point...

You are suggesting this was NEW. Why wouldn't this be just as likely to be something old? Google lowers the score of sites and pages all the time, so why wouldn't someone assume this was a cancerous version of one of Google's existing penalties, most obviously the -30?

There is no evidence to suggest this was Google trying something new deliberately, and considerably more to suggest that something in the current sauce mutated accidentally - something that was thought to be benign. A deodorant can is benign on its own, but if you add heat it explodes. If you didn't know that, how would you know to beware of it?

    Hissingsid

    10:22 am on Jan 31, 2008 (gmt 0)

    WebmasterWorld Senior Member 10+ Year Member



    Hi,

    I have only one (2 word) term affected. That term:

    1. Has "Searches related to:" at the bottom of the results page.

    2. Is the biggest $ Adwords term by far in our market. In fact over half of the searches for things related to our product are for this two word term.

    3. We were at #1 for this term for 2 years before the Florida update and then have been at #1 since, until we were hit by this effect.

4. Add another non-joining word to it and we go back to #1 or #2.

    5. We have retained our top slots for our target 3 word terms.

    6. One of the words in the two word term was identified as causing the problem during the Florida aftermath. Googleguy looked at the issue and we went back to #1. I strongly believe/know that the issue was one of disambiguation.

    7. I've recently reported a few instances of inappropriate broad match Adwords ads appearing for this term.

    In my opinion they have targetted big volume terms and implemented some form of phrase based element into the algorithm.

    The term in question takes the form:

    Thing Service

Thing is a potentially ambiguous word.

    Service can be applied to any number of different things.

When you join Thing & Service together in a search term, the amount of ambiguity is massively reduced. If they are looking at the phrase, then this changes the effect of the balance between these two words and reduces the effect of the semantically close words which occur on some pages that used to help Google disambiguate the "Thing" word.

FWIW I think that this is a retrograde step, as it is easier to spam. Before, you had to have semantically rich pages on the subject; now you just have to have the phrase in the right places at the right degree of repetition and prominence and you will find yourself at the top.

    In theory.

    Cheers

    Sid

    [edited by: Hissingsid at 10:24 am (utc) on Jan. 31, 2008]

    AjiNIMC

    11:49 am on Jan 31, 2008 (gmt 0)

    WebmasterWorld Senior Member 10+ Year Member



Our story is somewhat similar to SID's, including the Florida aftermath. We are still at #5 (#6 in some of the data centers). My concern is that a lot of new sites have also popped in which are not that credible.

    Aji

    ridgway

    4:28 pm on Jan 31, 2008 (gmt 0)

    10+ Year Member



    Sid,

    Question -- is it the same for the singular & plural two word term on your site? (realizing that without knowing the term in context, might not make sense)

    My experience is somewhat similar. We have 'related searches' for both the singular and plural (location widget / location widgets), but the #6 jumble only hit the plural.

Google shows 800k competing pages for the singular, 600k for the plural, but I am certain the plural draws higher ppc bids, FWIW.

    Once the #6 was lifted, we bounced right back to #1/#2 for the plural and singular, but that was short lived -- currently at #4/#5, with some questionable sites above us that are OBVIOUSLY paid link driven, with no attempt to hide it.

    I think whatever it is, it seems to still be in effect, but no longer with a hard #6 result -- the serps float a bit more.

    Kyle

    Miamacs

    4:48 pm on Jan 31, 2008 (gmt 0)

    10+ Year Member



did anyone - apart from trinorth - read my reasoning two posts before [webmasterworld.com]?

    just ignore the overly modest 'disclaimers' and please do it.
    - yeah, I'm so modest it hurts.

    ...

    no, really, try applying the logic explained in there to your sites' content and rankings. don't worry, it's easier done than said.

    I want to know whether I give too much credit to Google's in-house communication. I mean these things are all technologies they actually *use*, it's just that they were developed for different departments.

    [edited by: Miamacs at 5:43 pm (utc) on Jan. 31, 2008]

    rekitty

    7:58 pm on Jan 31, 2008 (gmt 0)

    10+ Year Member



Some of these tracked query terms were relatively far down the long tail. But even there, the statistics were very clear. These stats looked at many hundreds of key search terms, taken from a sizable number of domains. Over 60% of the terms that went to lower positions in December went precisely to position #6.

    This is exactly what I saw in our statistics. We had dozens of terms (head to tail) that went precisely to position #6. They are now back in their previous ranges.

The reminder from this "mistake" is that Google has control knobs that can dramatically impact the traffic they send to a site, and they are willing to twist the knobs willy-nilly.

    Don't build a business that depends on Google traffic (organic or PPC).

    inactivist

    5:51 am on Feb 2, 2008 (gmt 0)

    10+ Year Member



    Don't build a business that depends on Google traffic (organic or PPC).

    Why limit this statement to Google? :D

    crobb305

    8:29 pm on Feb 2, 2008 (gmt 0)

    WebmasterWorld Senior Member 10+ Year Member



    Are you guys seeing a slow return to normal (trickling back up) or a sudden reversal?

    tedster

    9:24 pm on Feb 2, 2008 (gmt 0)

    WebmasterWorld Senior Member 10+ Year Member



In some cases, yes, it's a sudden jump back. But in others it's more of a trickle back and not a full reversal to the previous #1. Still, it's better now than that sudden drop to #6. I haven't seen any cases stay at #6 or (shudder) drop any further.

In one case, for example, the url had been holding at #1 for more than a year. After being released from #6, it went first to #5, then to #4, and today it's at #2. In the case of this query, a new "Product search box" has appeared at the top, creating an "eleventh result".

    Another case went from a long time #1 to #6 and then directly to #2 in one jump. But still others have gone from #6 directly to #1.

    If the ranking shifts had behaved like this from the beginning, instead of sending so many urls to exactly #6, there would have been a lot fewer reports of something funny/new going on.

    [edited by: tedster at 12:50 am (utc) on Feb. 3, 2008]

    Whitey

    12:40 am on Feb 3, 2008 (gmt 0)

    WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



    http://sphinn.com/story/24687 - Matt Cutts :

    In general if you think a site might have a penalty (perhaps from past behavior) and you think the site is clean presently, you can do a reconsideration request in our webmaster console to ask Google to take another look at the site.

This implies that sites dropped across the board. But it also implies Google was looking to select criteria that were intended to drop "some" sites.

What are the things that Google is looking to drop sites for? This is what I'm interested to know, since it heralds a new twist in Google's filter intentions.

This 102-message thread spans 4 pages.