Forum Moderators: Robert Charlton & goodroi
- Tedster #:3699468 [webmasterworld.com...] - The most perplexing new SERP observations are those that report cycling, sine waves, yo-yo, rollercoaster, or pick your favorite synonym. Sometimes these cycles happen down in the deep results pages after a url has dropped from page 1 - an apparent penalty. And sometimes the cycling appears on page one - from 3 to 10 to 3 to 10, day after day or week after week. I don't have a site under my auspices that is showing this effect, but I've been asked to look at a few that are - and so far, I can say that the phenomenon is real, but I am mystified by it. I felt this way when the -950 first appeared back in 2006 or so, and slowly some understanding of that emerged. I sure hope we can get some understanding about the yo-yo phenomenon, too.
Are we seeing something new in how it's applied?
Is it a Google glitch or intentional?
Does it affect only sites in penalty situations?
Does it form part of new penalty-handling procedures?
Any more questions and suggestions?
Tedster #:3708527 This is something that quite a few sites are reporting - and it often (always?) involves position #4 during the periods when the url is on the first page of the SERPs. This seems like it must be some kind of statistical testing to me. But if that's the case, how does a url get picked to be tested - and even more, how can it "pass" the test? Some urls have been on this Google yo-yo for weeks and weeks.
The yo-yo has afflicted sites that were regular fixtures on page #1. Maybe it is unusual fluctuations in backlinks that trigger the test - that's worth watching!
I'm watching a site that was penalised on May 31 & has been flying around on a key term from position 39 down to anywhere on page 7. None of the site's URLs for any previously ranked term appear above position 41.
Tedster had a theory about "let's see" and "test", but I'm not sure that I understand what you think they may be testing.
3. For a site to maintain a natural high position it needs to a) be "trusted" and b) I think have a low bounce rate.
I wonder about the "have a low bounce rate" part.
Some sites deliver the info a surfer is looking for on the landing page. While most webmasters could easily set things up so that the surfer would need to look at 2 or 3 pages (at least) to get the same info, why should they?
Aren't surfers better served if all the info they seek can be, and is, efficiently delivered on the landing page?
What do you think?
If you can supply more details it would assist. It's not unusual [I would think] for a site to yo-yo in the period before final demotion. What this thread is probably more focused on is sites that stay in the SERPs, even though some have penalties applied at minus levels. Those that have disappeared completely are more likely to be under the auspices of the so-called -950 penalty.
My suspicion is that your site breached some thresholds. First suspects are duplicate content & canonical issues [complex].
Second suspicion is links.
Then there's the timing sequence of things you may have done.
That kind of thing is most especially what I've been calling the Google yo-yo.
Yes, I do feel that some weakness in the backlink profile (particularly anchor text) can trigger this kind of split position, or ranking yo-yo. One case I know about is a high-trust major corporation that did some SEO to try to rank for a competitive term they had been ignoring. It's for that term that they yo-yo back and forth between 4 and 14. However, even continued link development has not been a remedy so far in this case. Maybe the new links are too 'perfect' in some way.
I do suspect that this yo-yo ranking is some kind of a test, and if it is, eventually the url should either pass or fail. So what I've been waiting to hear about is a yo-yo ranking either (1) becoming stable, or (2) vanishing (at least taking a deep dive). mrperfect4all's post above is one such report.
I do suspect that this yo-yo ranking is some kind of a test, and if it is, eventually the url should either pass or fail....
Complete speculation here. It may also be that Google is testing itself by getting different "views" of site performance, user satisfaction, Universal normalization, various algo-factors, human reviewers, etc over an extended period of time... and then seeing if they can find dependable correlations that will scale.
Previously, Google did various algo shakeouts in the fall and winter, and they received lots of complaints from Webmasters that they were messing up holiday shopping season. It could be that this year they're doing their major shuffling in, say, June, July, Aug, instead of Sept, Oct, Nov, messing up other seasons, but giving some merchants the holidays.
(Note that I say this hoping for some kind of stability at some point, though perhaps I should also be careful about what I wish for).
I'm seeing the yo-yo movement as an extension of what I was seeing with the -950 / end-of-results rankings, which often cycled, but over a much longer cycling time. It's as though they're now in a different phase of the same testing, or else they've just sped the whole thing up... and instead of going all the way from top to bottom, different pages seem to be moving around within certain upper and lower limits. As I've theorized before, a link or on-page change that pushes a page up or down enough might cause it to bump into the top or bottom hard enough to bounce, or go into a kind of unstable oscillation.
Optimizing a page now feels like walking a tightrope in many different dimensions... and you can't be either too perfect or not good enough. It may be that Google is trying to zero in on an acceptable range of optimization themselves by letting the rankings oscillate within top and bottom limits they feel it might deserve under different scenarios, and then observing reactions to the page in the SERPs.
The thought recurs that there's a user and reviewer satisfaction component to this... not as a ranking factor so much as a correlation factor. Again, I'm mixing observation with speculation here.
I only partially agree with comments that pages with the right trusted links will ultimately win out, because I know of several pages on sites that seem to be holding in place on very competitive searches in spite of no particularly linkworthy content or high quality links that I've seen... and no tricks either.
However, even continued link development has not been a remedy so far in this case
Then TEST something else....
My "solution" is a dumbed-down version of the answer. It works enough of the time (90%) that I feel confident enough in suggesting it to the general masses.
That does not mean it works in each and every instance
Nor am I going to explain every single other "tweak" I've done to every single different page to make them "stick"
------------------
I only partially agree with comments that pages with the right trusted links will ultimately win out,
I only "partially agree" as well. (see above) Careful reading of my posts indicates that I'm leaving enough out to not give away $10k worth of information for free, while also pointing people in the right direction.
this yo-yo ranking is some kind of a test, and if it is, eventually the url should either pass or fail
Again, yes. So the EASY answer is to stop getting placed in the "testing range" of whatever is being tested. There are a billion ways to do this, but some are more effective for taking pages out of the "testing range" PERMANENTLY than others.
But if you continue to look at it as Goog testing your page specifically for something, then you'll be running around in circles forever.
It goes without saying that EVERY PAGE ON YOUR SITE should be continually optimized for the maximum click-thru rates, conversion rates, customer satisfaction, etc.
That argument is a lazy SEO's explanation for NOT knowing what to do or not having tested enough.
In one or two cases I've seen, there are one or two that *might* possibly make the grade (NASDAQ listed national brands), but they also fluctuate, no less than any of the others. I've also seen a correlation when Google lists their shopping links at #4. Those seem to be very volatile.
Nor am I going to explain every single other "tweak" I've done to every single different page to make them "stick"
But what do you see as the specific "high level" elements that folks should be testing and tweaking? Anything not already covered in the various threads?
I thought it would help the community to strengthen the fix on this problem.
.. but not all the Yo Yo problems are of this nature. Some are site wide and not so simple.
We are seeing identically created site structures with different content behaving differently.
There are differences in the TLDs, historical redirect patterns and backlink accumulation. It's not clear if this Yo Yo is a "one type fits all" scenario - I think not.
My guess is that it takes in a multitude of sins, however the principle of "testing" may be akin to sitewide testing. As you can imagine this is a little harder to test and get timely results from the experiments.
Hmmmm... one thing that hits me as a hunch from Whitenight's reports is that any site caught in the Yo Yo is indeed going through a "temporary loss of trust".
You either fix the problem it is citing, or you go south in a big way, i.e. "the site is on report" [some others seem to indicate a quick wobble, then crash for more severe cases].
... Any thoughts ?
[edited by: Whitey at 7:37 am (utc) on Aug. 18, 2008]
You either fix the problem it is citing, or you go south in a big way, i.e. "the site is on report" (some others seem to indicate a quick wobble, then crash for more severe cases)
hmm. crash and burn.... lol
That's certainly a good excuse to have you test a lot more, eh?
Site, page. Doesn't matter.
There's just what needs to be done to fix the problem.
We are seeing identically created site structures with different content behaving differently.
Ok, and so?
Obviously, one site could rock and the other suck.
Use one to test the other to at least find the MACRO issues that are affecting each one. And again, I highly doubt they have the exact same link profile anyway, so why does that rule out anything I've said earlier?
As you can imagine this is a little harder to test and get timely results from the experiments
You'll get results much faster now than any other time in Goog history. Or you can wait til you "fail the test" and then you'll have all eternity to test what the problem is....
One of those was on the universal search yo-yo, going from 4 to 14 during the day, every day. Looks like that site has now settled into a solid #15 with no more yo-yo. The yo-yo lasted at least a month, and probably more like two months, but I wasn't watching early on.
Interesting side note, the Universal Search position now seems to have moved to #7 for this query term. Now #15 is still better than this business used to do for this particular term, so really the yo-yo was a kind of opportunity that they didn't fully cash in on.
My overall sense right now is that yo-yo testing is not very widespread. It's real enough - but at least at the top of competitive SERPs, it's not something many businesses run into. Still, it can be disconcerting when it happens.
We don't normally think of this kind of volatility in the Google results. But Google does have the ability to change results by time of day or day of week -- and there are even some season of the year variables rolled into the mix. We may be seeing a bit more of this if Google finds it useful.
----
Another kind of yo-yo testing seems to involve "diversity" results - query disambiguation and that kind of thing. In this case, the particular result I'm watching is a kind of typo. The real keyword spelling is a proper name, but there's another proper name that is spelled something like it. The yo-yo behavior seems to get tripped by publicity burstiness for the nearly-the-same spelling urls on other sites.
[edited by: tedster at 7:42 pm (utc) on Aug. 18, 2008]
It is now in its longest period of remission after a few small tweaks some weeks ago. All I have changed is the number of items on a page from 100 to 50 (think of a classified-ads type site) and made some tweaks to the db indexes, as WMT was showing typical page load times of over 1500ms. A couple of days later the SERPs recovered a bit and have got better since. WMT now shows around 1000ms typical. It's not back to where it started, but is still climbing.
Now I don't think that page load speed is a single cause, but I am starting to think it may be contributory - maybe a number of factors add up to a 'user satisfaction' score / weighting and this has tipped the balance?
Of course it's not over yet. I still check the SERPs twice a day at the moment, half expecting another vanishing act, but hopefully it might just stick this time. At least I can now get back to content instead of firefighting. Now, where's my AJAX book got to.....
Google rotating sites into serps with very low off-site optimization.
How long will this wild trend continue? For our best keyword phrase, Google continues to rotate in some sites that are young, with no backlinks, low PR, etc. While they are nice, content-based, article pages that appear to relate somewhat to the topic, their page and site strength seem very low. In other words, they've done some on-site optimization but absolutely no off-site optimization that I can see (and I can see a lot).
Meanwhile, Google has dropped other highly optimized sites (including mine, of course) to lower positions.
This flies in the face of conventional SEO wisdom on so many levels.
Your experiences... comments?
[edited by: tedster at 4:55 pm (utc) on Sep. 17, 2008]
[edit reason] moved from another location [/edit]
At the same time, my more stable ranking for the same page has gone from page 10 to page 3. I can see one of the other sites in the rotation doing a similar move. The Yo-yo has been going on for a week now.
* more intense and more frequent from 2008 onward, if you compare them to previous years;
* all types of sites are subject to this kind of effect: portals, institutional websites, blogs;
* they are related to near-spam;
* they are related to networks of sites on the same machine or in the same network;
* it seems that the document "Web Graph Similarity for Anomaly Detection" of 22 Jan 2008 explains a part of the causes;
* they can occur more frequently when the TLD domain is hosted on a machine outside the country (for example: a .it domain hosted in Germany).
So my conclusion is that yo-yos are tangible and destructive effects of an algorithm, and not something that just happens.
Of course I thank you for your patience with my poor English,
and I'd like to know what you think about it.
[the yo-yo effect] can occur more frequently when the TLD domain is hosted on a machine outside the country
Interesting idea. How many total examples does your data have?
are related to networks of sites in the same machine or in the same network
Maybe - how about this idea? The Yo-Yo might kick in when anchor text shows on domains that seem to be related. In other words, one of those web-graph anomalies shows up.
I've seen the yo-yo effect on sites whose total backlink profile was very large, and most definitely included a lot of "editorial links" from "high trust" domains. But the sites involved were trying to place strategic anchor text to rank well for a new phrase. Because of that, the new anchor text only appeared in a small number of domains and over a short period of time. That kind of anomaly might be detected and hit with a yo-yo effect.
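The anomaly described above - a new anchor-text phrase whose links appear on only a handful of domains within a short window - can be sketched as a toy heuristic. This is pure speculation dressed up as code: the threshold numbers, the function, and the example domains are all hypothetical illustrations, not anything Google has documented.

```python
from datetime import date

def anchor_text_anomaly(link_events, min_domains=5, min_span_days=30):
    """Toy heuristic: flag a phrase's link profile as 'bursty' when the
    links come from few distinct domains, all acquired in a short window.
    link_events: list of (domain, acquired_date) pairs for one phrase."""
    domains = {d for d, _ in link_events}
    dates = [t for _, t in link_events]
    span = (max(dates) - min(dates)).days
    return len(domains) < min_domains and span < min_span_days

# 12 links, but only 2 domains and all within one week of each other:
# flagged as anomalous under these made-up thresholds.
events = [("blog-a.example", date(2008, 8, d)) for d in range(1, 7)] + \
         [("blog-b.example", date(2008, 8, d)) for d in range(1, 7)]
print(anchor_text_anomaly(events))
```

The point of the sketch is just that such a pattern is trivially detectable from a web graph, which would fit tedster's reading of the anomaly-detection paper mentioned later in the thread.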
The main thing I have discovered is any time I edit my home page it drops in rank by 100-200 points. I change my home page about twice a month to list new clients, otherwise my home page rarely changes and keyword density doesn't change either.
In case I overoptimized that phrase, I removed several occurrences of those two words (even from the title this last time--which I haven't changed in a few years). We'll see if that helps.
I forgot to mention the above does not affect traffic or ranking for other keywords.
[edited by: Lorel at 10:53 pm (utc) on Nov. 1, 2008]
[webmasterworld.com...]
In my opinion the yo-yo effect describes the 'straddling' of a particular 'trust threshold'.
A website or page toggles between two different positions, daily or several times per week.
My experiment was generally setup to test nav factors in rankings, but also to test crossing the trust barrier, and testing various factors related to crossing that barrier.
One element I noticed was that a particular combination of factors edited on the website, directly related to 'optimization' (increasing keyword density to excessive levels, adding keywords to each link in the nav), triggered the drop - but when the changes barely crossed the threshold, the website would toggle (yo-yo), almost as if it was right on that line between being trusted and being tossed out.
The experiment brought up questions I had, one of which was - is the Trust factor different on two DC's, as if one has newer algorithmic information and so was treating the page differently?
However, I quickly put this to rest when I returned all elements to the original condition, and very quickly the page / domain stabilized and did not yo-yo anymore.
I am still testing this. It reminds me of driving a fast car on a slippery surface. If you make minor changes, you slide left and right but in general stay straight - but as soon as you 'crank' the wheel, the car gets out of control (possibly -950?).
Maybe the yo-yo effect describes a condition where Google has a built-in 'leeway' for websites with previous rankings - as if, having built up some trust, the merit of that is that you get a period of time where the trust is questioned, but you are not thrown down the long hall.
And let's face it, 99% of all webmasters would know when their website moved from position 4 to 11 for a popular keyword, and that might serve as a signal to them.
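The threshold-straddling idea in this post can be illustrated with a tiny simulation. Everything here is an assumption for illustration only - the threshold value, the noise level, and the two ranking slots (#4 / #14) are invented, chosen just to show how a score sitting right on a cutoff produces a daily toggle while one safely inside the range stays put.

```python
import random

TRUST_THRESHOLD = 0.50  # hypothetical cutoff; the real mechanism is unknown

def daily_ranks(opt_score, noise=0.05, days=10, seed=1):
    """Each day a small random wobble is added to the page's 'optimization
    score'; crossing the threshold flips the ranking between a trusted
    slot (#4) and a demoted slot (#14)."""
    rng = random.Random(seed)
    ranks = []
    for _ in range(days):
        score = opt_score + rng.uniform(-noise, noise)
        ranks.append(4 if score < TRUST_THRESHOLD else 14)
    return ranks

# A page comfortably inside the trusted range never toggles,
# while one sitting right on the threshold yo-yos day to day.
print(daily_ranks(0.30))
print(daily_ranks(0.50))
```

This also matches signal's observation: reverting the edits moves the score back off the line, and the toggling stops.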
It's interesting to see the Yo Yo also between 40 and 70, but on different cycles between DCs.
I wonder if this filtering is purely automated or if it can involve a Google editor triggering it.
I have seen the yo-yo effect at various ranking spots now - between 5 and 11, 12 and 21, front page to second page (to the same positions each time)
From the testing that I have been doing, I can at least confirm that it can be triggered automatically. It is uncertain at this time as to whether an editor could trigger it.
My sense is that Google would prefer to adjust the algo to address websites that are at the 'threshold' of what Google believes is 'wringing too much water from stone' on a website.
Marcia - that is entirely plausible as well, and would certainly account for situations where webmasters are reporting the yo-yo effect without having made any changes to the website whatsoever.