Forum Moderators: Robert Charlton & goodroi

Yo-Yo Effect - Observations and Understandings

         

Whitey

2:33 am on Jul 31, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



There are some interesting discussions on another thread about the Yo-Yo effect. Any chance of splitting this onto a new, focused thread for better understanding:

- Tedster #:3699468 [webmasterworld.com...] - the most perplexing new SERP observations are those that report cycling, sine waves, yo-yo, rollercoaster, or pick your favorite synonym. Sometimes these cycles happen down in the deep results pages after a url has dropped from page 1 - an apparent penalty. And sometimes the cycling appears on page one - from 3 to 10 to 3 to 10, day after day or week after week.

I don't have a site under my auspices that is showing this effect, but I've been asked to look at a few that are - and so far, I can say that the phenomenon is real, but I am mystified by it. I felt this way when the -950 first appeared back in 2006 or so, and slowly some understanding of that has emerged. I sure hope we can get some understanding of the yo-yo phenomenon, too.

Are we seeing something new in how it's applied?
Is it a Google glitch or intentional?
Does it affect only sites in penalty situations?
Does it form part of new penalty-handling procedures?
Any more questions or suggestions?

Tedster #:3708527 This is something that quite a few sites are reporting - and it often (always?) involves position #4 during the periods when the url is on the first page of the SERPs.

This seems like it must be some kind of statistical testing to me. But if that's the case, how does a url get picked to be tested - and even more, how can it "pass" the test? Some urls have been on this Google yo-yo for weeks and weeks.

The yo-yo has afflicted sites that were regular fixtures on page #1. Maybe it is unusual fluctuations in backlinks that trigger the test - that's worth watching!

I'm watching a site that was penalised on May 31 and has been flying around on a key term, from position 39 down to anywhere on page 7. None of the site's URLs for any previously ranked term appear above position 41.

Tedster had a theory about "let's see" and "test", but I'm not sure that I understand what you think they may be testing.

tedster

7:17 pm on Nov 2, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



My sense is that Google would prefer to adjust the algo to address websites that are at the 'threshold'...

That's what Google has always said - and it makes a lot of sense for scalability over this huge data set.

Also, even the thresholds themselves could be reset dynamically based on future measures of the web, or input from Google's internal QA metrics. So a site could be over the threshold at one time and then the THRESHOLD moves - and rankings improve even though the site made no changes.

whitenight

7:57 pm on Nov 2, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Glad this popped back up.

From what I'm seeing, sites that are within my "Ghost Datacenter" (quality editorial controls à la MC) from the current update/test have also been those that were immune to yo-yo-ing (or escaped it permanently).

Very interesting.

tedster

8:12 pm on Nov 2, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I know of one well-known international business that thought they were permanently immune. They learned this summer that they were not. I'm not sure that any website can count on immunity these days if it pushes too hard, or uses tactics that are too blatant.

Seems to me that no matter how high the trust-o-meter is set, you can still get bonked.

whitenight

8:24 pm on Nov 2, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Sure, no one's ever permanently safe, but more importantly, MC confirmed my suspicions that there is a "quality signals incorporated" dataset that consists of authority sites that automatically get a "pass" at this point in time.

That's my "Ghost Dataset" - it seems to be instantly incorporated into the SERPs and includes the "beyond trusted authority" sites. (Note: someone remarked in that thread that even .gov sites were "missing".)

That's also the group of sites I meant earlier in this thread - the ones that avoid getting into the "yo-yo" situation (I didn't have a name for it yet).

tedster

8:31 pm on Nov 2, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I got that - it reminds me of the old Inktomi approach.

CainIV

11:01 pm on Nov 2, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"quality signals incorporated" Dataset,
that consists of authority sites that automatically get a "pass" at this point in time."

This matches up well with what I was seeing at periodic points during the test. Particular websites in the top 5 of previous SERPs were zeroed - literally past the 1000 mark - and then tested within the context of the Ghost set.

It was almost as if Google was also testing the validity of its own 'seed' websites, as well as the child seeds, applying a '2nd and 3rd level', and then reapplying this back across the entire index. I could see this quite clearly as obvious authority websites were added before the additional websites that had previous rankings but would not be considered authorities.

At the same time, we saw snapshots of various elements of the overall algo as they were reapplied against the new set. This might explain why many documents ended up in the index that didn't make sense - perhaps those websites were within x degrees of the seed (or the 2nd/3rd-level child sets) but had particular attributes that were re-filtered, and are now slowly moving out of the top rankings.
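To make that 'seed' picture concrete, here's a minimal sketch of the kind of propagation being described - a TrustRank-style spread outward from hand-picked seed sites through 2nd- and 3rd-level children. The link graph, site names, decay value and hop count are all invented for illustration; nothing here is Google's confirmed method:

```python
# TrustRank-style sketch: trust starts at hand-picked seed sites and is
# propagated to the pages they link to, decaying with each hop.
# The graph, names and numbers below are all invented for illustration.

links = {
    "seed.example":       ["child1.example", "child2.example"],
    "child1.example":     ["grandchild.example"],
    "child2.example":     ["grandchild.example", "spammy.example"],
    "grandchild.example": [],
    "spammy.example":     [],
}

seeds = {"seed.example": 1.0}   # hand-reviewed seed set, full trust
decay = 0.85                    # damping per hop, PageRank-style

trust = dict(seeds)
frontier = dict(seeds)
for hop in range(3):            # the "2nd and 3rd level" child sets
    nxt = {}
    for site, t in frontier.items():
        outlinks = links.get(site, [])
        if not outlinks:
            continue
        share = decay * t / len(outlinks)   # split decayed trust evenly
        for target in outlinks:
            nxt[target] = nxt.get(target, 0.0) + share
    for site, t in nxt.items():
        trust[site] = trust.get(site, 0.0) + t
    frontier = nxt

for site, score in sorted(trust.items(), key=lambda kv: -kv[1]):
    print(f"{score:.3f}  {site}")
```

In a model like this, any site within x degrees of a seed inherits some trust automatically - which is consistent with the idea that such sites could enter the index first and only later be re-filtered on other attributes.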

whitenight

11:14 pm on Nov 2, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



perhaps those websites were within x degrees of the seed (or the 2nd/3rd-level child sets) but had particular attributes that were re-filtered, and are now slowly moving out of the top rankings.

Nice! Yes! That's exactly what they are, now that you put it that way.
I couldn't quite put my finger on some of the 2nd and 3rd "seed" pages.
Very powerful data there.
Great analysis.

misterjinx

11:48 pm on Nov 2, 2008 (gmt 0)

10+ Year Member



I have some cases of .it sites hosted outside Italy.

A threshold ... yes, I also thought about this interesting idea, and in fact the yo-yo hits high-ranking/high-traffic keywords.

But in the Google patents (filed and issued) that I have read in full, there's no evidence (or anything that could be taken as evidence) of the yo-yo effect.

What I see at the moment is that often the yo-yoing initially seems to follow a model like the Poisson distribution.

Marcia

12:57 am on Nov 3, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



What I see at the moment is that often the yo-yoing initially seems to follow a model like the Poisson distribution.

Poisson distribution [childrensmercy.org]

What is a Poisson distribution?

The Poisson distribution arises when you count a number of events across time or over an area. You should think about the Poisson distribution for any situation that involves counting events. Some examples are:

* the number of Emergency Department visits by an infant during the first year of life,
* the number of pollen spores that impact on a slide in a pollen counting machine,
* the number of incidents of apnea and bradycardia in a pre-term infant,
* the number of white blood cells found in a cubic centimeter of blood.

Sometimes, you will see the count represented as a rate, such as the number of deaths per year due to horse kicks, or the number of defects per square yard.


One of the patent applications out there fits the possibility of that model, or something closely akin to it, like a hand fits into a glove.

I mentioned that the rankings might change by a set time period or a set amount of traffic. I recently thought of a third possibility - a set number of impressions, whether they're clicked on or not. A couple of my connections in the SEO world have mentioned that this wouldn't make sense. Well, that may be true, but something certainly is happening. The -950 didn't "make sense" to us either, but it happens.

According to how that patent application reads, it appears that your possibility could make a lot of sense.

[BTW, Inktomi's database containing whitelisted sites (back when they had dual databases) was referred to as "Best of the Web."]
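For readers who want to see what a Poisson model actually predicts, here's a minimal sketch. The weekly rate is a made-up number; the point is just the shape of the distribution:

```python
import math

def poisson_pmf(k, lam):
    """Probability of seeing exactly k events when the mean rate is lam."""
    return lam ** k * math.exp(-lam) / math.factorial(k)

# Made-up rate: suppose a monitored url averages 1.5 yo-yo drops per week.
lam = 1.5
for k in range(6):
    print(f"P({k} drops in a week) = {poisson_pmf(k, lam):.3f}")

# If real rank logs matched a table like this, the drops would look like
# independent random events at a steady rate, rather than a fixed timer.
```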

CainIV

2:50 am on Nov 3, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Hello Marcia. Great technical information, but we need to know how this distribution fits the model of the SERPs as it relates to a website moving between two distinct ranking positions. Knowing that it occurs, and what it's called, is only step one :)

tedster

4:25 am on Nov 3, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



misterjinx, can you clear up for us in what way you see a Poisson distribution with the Yo-Yo effect? Is it in play with the way the phenomenon first appeared and then grew? The number of urls affected over time? The number of query terms affected over time? Or perhaps in the number of url-keyword pairs that are tagged for yo-yo treatment over time?

It wouldn't surprise me that something like a Poisson distribution could be seen in a data set this large. The ingredients are there - a large number of potential candidates for the event with a very small number of actual examples.

Still, I'm not sure what that can tell us that we can act on. It certainly could be the case that the yo-yo effect is an artifact of some kind of probabilistic testing, some way of giving more urls some "time at the top" of a crowded SERP. But still, some factor must make a url-keyword pair a candidate, right?
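That "many candidates, few events" setup is precisely the classic condition under which a Poisson distribution emerges from a binomial process. A quick sketch with invented numbers - a large pool of url-keyword pairs, each with a tiny chance of being tagged:

```python
import math

def binom_pmf(k, n, p):
    """Exact probability of k successes in n independent trials."""
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

def poisson_pmf(k, lam):
    return lam ** k * math.exp(-lam) / math.factorial(k)

n, p = 1_000_000, 2e-6   # invented: a million candidate url-keyword pairs,
lam = n * p              # each with a tiny chance of being tagged

for k in range(6):
    print(f"k={k}: binomial={binom_pmf(k, n, p):.5f}  "
          f"poisson={poisson_pmf(k, lam):.5f}")
# The two columns agree to several decimal places - with rare events in a
# huge pool, the counts are effectively Poisson-distributed.
```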

Marcia

6:49 am on Nov 3, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Still, I'm not sure what that can tell us that we can act on. It certainly could be the case that the yo-yo effect is an artifact of some kind of probabilistic testing, some way of giving more urls some "time at the top" of a crowded SERP. But still, some factor must make a url-keyword pair a candidate, right?

The question might be a two-fold horse race: whether it's factors related to scoring of the fluctuating sites themselves, or whether it's testing of Google's presentation of sites in the SERPs, including snippet generation and positional placement.

Either way, wouldn't it still go back to the sites themselves (aside from snippet generation and displays), with regard to bounce rates, relevancy and user satisfaction with the sites themselves?

CainIV

7:54 am on Nov 3, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



But still, some factor must make a url-keyword pair a candidate, right?

One factor could be loss of trust in the amount of x where the url previously held a position of trust over a period of y.

When a website loses a position or two in high-ranking SERPs, the measured difference in quality is very small - there is usually only a very small difference between the sum of all factors for sites 1, 2 and 3.

However, when excessive trust is lost in a short period of time, this could trigger the yo-yo. As opposed to dropping the page off the planet, out to page 2 and beyond, the page could yo-yo in and out for a period of time before finally falling off the map.

The only problem with this theory is that other webmasters have reported newer, less trusted websites yo-yo'ing - in which case, could it be that the phenomenon is now built into Google as a preventative tool?

Another theory could be some kind of fluid, dynamic trust scale based on user metrics, as Marcia described. If it were measured and applied minutely enough, small changes in landing pages could theoretically trigger the effect in the SERPs.

Marcia

8:29 am on Nov 3, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>user metrics
>>If it were measured and applied minutely enough, small changes in landing pages could theoretically trigger the effect in the SERPs.

Or maybe no changes in landing pages could also be a possibility, n'est-ce pas? After drinking and digesting enough Google Kool-Aid, might not a conclusion be drawn that whatever they're measuring, whether it's site factors or SERP statistics, what they're trying to gauge is user metrics, and trying to test and accommodate user preferences?

CainIV

7:08 pm on Nov 3, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



and trying to test and accommodate user preferences

Naturally, since that is the end goal for Google, regardless of how the algo deals with particular elements for each website.

misterjinx

10:50 pm on Nov 3, 2008 (gmt 0)

10+ Year Member



misterjinx, can you clear up for us in what way you see a Poisson distribution with the Yo-Yo effect? Is it in play with the way the phenomenon first appeared and then grew? The number of urls affected over time? The number of query terms affected over time? Or perhaps in the number of url-keyword pairs that are tagged for yo-yo treatment over time?

I mean in terms of ranking in the SERPs for a specific (yo-yoing) keyword across the different Google datacenters I monitor.

Please also note that the fluctuations, or yo-yo, call them what you want, are more frequent AFTER the introduction of PhraseRank and, as already noted, very frequent starting from 2008.

But it's possible that this kind of phenomenon was active BEFORE these dates and only now are we discovering the effects.

Finally, another observation from studying two different sites about the same topic, related only by an internal link: from a certain date, the first site started fluctuating and losing traffic while the second one recovered its ranking and traffic.

[edited by: tedster at 10:53 pm (utc) on Nov. 3, 2008]

Erku

5:25 am on Dec 1, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



How long does the Yo-Yo Effect take?

[edited by: tedster at 6:25 am (utc) on Dec. 1, 2008]
[edit reason] moved from another location [/edit]

chelseaareback

2:06 pm on Dec 1, 2008 (gmt 0)

10+ Year Member



Erku

If you mean how long it takes to yo-yo back up again, I'm sorry to say that in our experience it varies. We yo-yo'd 10 times this year - the quickest took 4 days to correct, the longest was about 4 weeks.
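One way to pin those recovery times down is to keep a daily rank log and measure each dip automatically. A small sketch of that bookkeeping - the rank history, dates and threshold below are all invented:

```python
from datetime import date, timedelta

# Invented daily rank history for one keyword; None = not in the top 100.
history = [3, 3, 4, 42, 45, 41, 44, 3, 4, 3, 38, 40, 39, 41, 40, 4, 3]
start = date(2008, 11, 1)   # date of the first observation (invented)

GOOD = 10        # anything ranked above this counts as "normal" (page 1)
dips = []        # (start_date, length_in_days) of each yo-yo dip
dip_start = None

for i, rank in enumerate(history):
    bad = rank is None or rank > GOOD
    if bad and dip_start is None:
        dip_start = i                     # a dip begins
    elif not bad and dip_start is not None:
        dips.append((start + timedelta(days=dip_start), i - dip_start))
        dip_start = None                  # recovered

for when, days in dips:
    print(f"dropped on {when}, recovered after {days} day(s)")
print(f"{len(dips)} yo-yo cycles; longest took {max(d for _, d in dips)} days")
```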

Marcia

2:29 pm on Dec 1, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I've just seen a site "yo-yo" up to #15-16 after languishing down around 60-ish (and lower) for a long time - it had been at #15-20 for a couple of years before that. It's unmistakably because an offending factor was removed from the site, and it's been holding fast at its current 16-ish position for a couple of weeks now.

It was an internal anchor text issue that I fought long and hard to have changed; and when it finally was, it took around a week (give or take a little - I'm not 100% certain, since I didn't check daily) to rebound.

Clarification:

Excessive anchor text had been added preceding the drop - and it was an interior page, not the homepage, that was affected.

Excessive internal anchor text started to appear to be an issue around the time of the Florida update, and IMHO it's one of the primary over-optimization factors to look at; I'm still convinced of it, and it's not at all uncommon among DIY SEO'ing, from what I've seen.

[edited by: Marcia at 2:39 pm (utc) on Dec. 1, 2008]

whitenight

4:02 pm on Dec 1, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



You know, I woke up this morning pondering the never-ending DNA structure of the universe and how history always repeats itself -- only it doesn't, because it's at a higher octave of consciousness...

Marcia wrote:

It's possibly from excessive internal anchor alt text

whitenight wrote:

Marcia, if you're familiar with the page, then this would coincide with my theory. (and something that's easy to test)

Are we still on the C# chord, or are we just seeing the same C# from a new level of awareness?

g1smd

5:14 pm on Dec 1, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Back in about 2003 or 2004 I noticed some sites drop to ~40 to ~60 from the top few positions for a couple of days every few months.

Is this in any way related to that? It's so long ago that I have absolutely no idea what I said about those observations at the time.

Whitey

10:10 pm on Dec 2, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I just dropped this in, in case it helps to stabilise the focus in the search for a stronger understanding:

Yes, seoit, I'm seeing the same. It always works this way: if you drop for bluewidgets and bluewidgets is NOT a competitive term, the drop will also affect ALL of your URLs - all of them will drop to similar positions (some sort of -40).

There is a discussion about exactly this problem on the Google webmaster help group (September), but no one from Google commented. [webmasterworld.com...]

Some more interesting discussion exists on this thread.

Is this a core behavioural symptom that everyone is seeing?

Irrespective of the causes, I'm wondering if this is the "new way" [aka the approx late May / early June way] of filtering "offending" sites.

sandboxsam

10:50 pm on Dec 2, 2008 (gmt 0)

10+ Year Member



Hi Marcia,

Could you expand on what you mean by "excessive anchor text"?

"Excessive anchor text had been added preceding the drop"

Thanks

JS_Harris

4:32 am on Dec 3, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



It involves position #4 because the top 3 are usually hand picked and enshrined up there.

When you get past all the filters and manual adjustments made to your site without your knowledge, it all boils down to the spread of internal pagerank and the anchor points of incoming pagerank.

Site dynamics dictate that you can fully control how high or how low specific pages rank, but you can't make such changes without directly and proportionally affecting other pages on your site.

My advice: if you have an extremely popular page that is entrenched in spot #1 of the search engines, begin funneling that pagerank to other pages you want to boost by linking to them. When you start to see the donor page waver from its position... that's enough. You ONLY want enough "pagerank - authority - whatever you want to call it" on a page to bring it to the top spot; more is a waste.

It's not really that simple - every change alters the overall dynamics of the entire site - but such micro-changes will help you understand your site better. You planned your entire site layout before you built it, right? Don't go overboard with changes.
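That funneling effect is easy to see in a toy PageRank calculation. This is the textbook simplified PageRank with uniform teleport and an invented three-page site, not anything Google-specific:

```python
# Toy PageRank to illustrate "funneling": linking out from a strong page
# lifts the target, but only by diluting what the donor passes elsewhere.
# The three-page site below is invented for illustration.

def pagerank(links, d=0.85, iters=50):
    pages = list(links)
    pr = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        nxt = {p: (1 - d) / len(pages) for p in pages}
        for page, outs in links.items():
            targets = outs or pages          # dangling page: spread evenly
            for t in targets:
                nxt[t] += d * pr[page] / len(targets)
        pr = nxt
    return pr

before = {
    "popular": ["about"],             # entrenched page links only to /about
    "about":   ["popular"],
    "new":     ["popular"],
}
after = dict(before, popular=["about", "new"])   # funnel some PR to /new

for label, graph in (("before", before), ("after", after)):
    pr = pagerank(graph)
    print(label, {p: round(v, 3) for p, v in sorted(pr.items())})
```

Running it shows "new" gaining almost exactly what "about" loses: the donor's own score barely moves, but everything it used to pass to one page is now split two ways.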

MarieN

9:57 pm on Dec 5, 2008 (gmt 0)

10+ Year Member



The frustrating part of this is that the yo-yo effect seems to kick in mostly for brand new optimized pages, while there are still a ton of old pages that rank very high that are overly optimized, have too many inbound links from unrelated sites, offer link exchanges and offer little value to the user. It would be nice to see Google do something about those pages.

tedster

11:43 pm on Dec 5, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



...the yo-yo effect seems to kick in mostly for brand new optimized pages

That's what I've seen too, even for new pages (and new query terms) for well-established brands. It looks like "the algo says this url ranks well for this term - but something about it seems worth testing, because it appeared out of the blue."

That's been my operating premise from the earliest examples I saw - but I still don't get how a url passes or fails the test period. I was astonished to see one url yo-yo between positions 15 and 4, only to settle in after a couple of months at #16. This particular site belongs to a company that is nearly synonymous with the query term offline, but they had lousy SEO for the phrase on their website. When they woke up and clarified things online, they got the yo-yo and failed the test. All I can think of is that their new backlinks with the specific anchor text were suspect.

Whitey

12:22 am on Dec 6, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



When they woke up and clarified things online, they got the yo-yo and failed the test. All I can think of is that their new backlinks with the specific anchor text were suspect

Would you say the alteration was considered too aggressive?

I wonder what G wants for the site to pass the test and re-appear.

misterjinx

2:25 pm on Dec 9, 2008 (gmt 0)

10+ Year Member



@Whitey - the yo-yo is not related to the offending filter (the original patent is documented here [appft1.uspto.gov]). The patent is about ads, but in the last paragraph you can read that it may also be applied to websites.

@tedster - the Yo-Yo effect affects more than just brands.

In Italy, on 6-7 Dec 2008, I was at an SEO congress, and according to the latest studies by the SEO I already mentioned in other posts (dechigno), the yo-yo seems not to be related to a single factor.

Dechigno reports 7 interesting cases of different websites - different in size, in content, in topics.

According to his conclusions, it seems the yo-yo effect could be determined by...
* a problem of content: you have a trusted domain for a topic and you try to use it to give ranking to another site on a different topic;
* dependence on a specific crawler scanning the website;
* a problem of near duplicates;
* a bug or issue in Google ...

tedster

5:35 pm on Dec 9, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



a problem of content: you have a trusted domain for a topic and you try to use it to give ranking to another site on a different topic;

That's the one that I see - the rest of the list seems a bit fuzzy to me. When does a near-duplicate problem trigger the yo-yo and not just a "similar pages" filter?

misterjinx

6:50 pm on Dec 9, 2008 (gmt 0)

10+ Year Member



tedster, you see the yo-yo when your rankings keep switching from the first page of the SERPs to about position 150, and so on.

The SEO reported different cases, and not all were caused by a trusted domain.

The difference between the filter and the yo-yo is that with the yo-yo your pages stay in the SERPs (they're not filtered into Google's supplemental results), but at a lower ranking - and sometimes they're at the top. In some cases your page disappears completely.

This 207-message thread spans 7 pages.