Forum Moderators: Robert Charlton & goodroi
- Tedster #:3699468 [webmasterworld.com...] - The most perplexing new SERP observations are those that report cycling, sine waves, yo-yo, rollercoaster, or pick your favorite synonym. Sometimes these cycles happen down in the deep results pages after a url has dropped from page 1 - an apparent penalty. And sometimes the cycling appears on page one - from 3 to 10 to 3 to 10, day after day or week after week.
I don't have a site under my auspices that is showing this effect, but I've been asked to look at a few that are - and so far, I can say that the phenomenon is real, but I'm mystified by it. I felt this way when the -950 first appeared back in 2006 or so, and slowly some understanding of that has emerged. Sure hope we can get some understanding about the yo-yo phenomenon, too.
Are we seeing something new in how it's applied?
Is it a Google glitch or intentional?
Does it affect only sites in penalty situations?
Does it form part of new penalty handling procedures?
Any more questions or suggestions?
Tedster #:3708527 This is something that quite a few sites are reporting - and it often (always?) involves position #4 during the periods when the url is on the first page of the SERPs. This seems like it must be some kind of statistical testing to me. But if that's the case, how does a url get picked to be tested - and even more, how can it "pass" the test? Some urls have been on this Google yo-yo for weeks and weeks.
The yo-yo has afflicted sites that were regular fixtures on page #1. Maybe it is unusual fluctuations in backlinks that trigger the test - that's worth watching!
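For anyone who wants to watch it systematically, here's a rough sketch of the kind of logging I mean - the daily positions have to come from whatever rank checks you already run, and the thresholds are just guesses, not anything Google defines:

```python
# Toy yo-yo detector: feed it a list of daily positions for one keyword/URL
# and it flags the page-1 <-> deep-pages oscillation being described here.
# The positions come from your own rank logs - nothing here queries Google.

def is_yoyo(positions, page_one=10, deep=30, min_swings=3):
    """Return True if the ranking repeatedly swings between page 1
    and a much deeper position (e.g. 4 -> 40 -> 4 -> 40...)."""
    swings = 0
    last_zone = None
    for pos in positions:
        if pos <= page_one:
            zone = "page1"
        elif pos >= deep:
            zone = "deep"
        else:
            zone = None  # middle ground, ignore
        if zone and zone != last_zone:
            if last_zone is not None:
                swings += 1
            last_zone = zone
    return swings >= min_swings

# Example: the "position 4 one day, deep pages the next" pattern
print(is_yoyo([4, 38, 4, 41, 5, 39, 4]))  # True
print(is_yoyo([4, 5, 4, 6, 5]))           # False - normal jitter, not a yo-yo
```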
I'm watching a site that was penalised on May 31 & has been flying around on a key term from position 39 down to anywhere on page 7. None of the site's URLs for any previously ranked term appears above position 41.
Tedster had a theory about "let's see" and "test", but I'm not sure that I understand what you think they may be testing.
Excessive internal anchor text started to appear to be an issue around the time of the Florida update, and IMHO it's one of the primary over-optimization factors to look at.
It looks like "the algo says this url ranks well for this term - but something about it seems worth testing, because it appeared out of the blue."
Thought I'd revisit an earlier remark.
Do you anticipate this is a semi-permanent testing thing & an indication of something less severe than a more permanent filtering position?
I hope no one has published this before.
Nowadays it's not clear what factor(s) start a yo-yo effect. After analyzing different cases, we noticed some common elements, like:
All kinds of sites can fluctuate, such as blogs and websites of different sizes (from small sites to big portals)
Problems with near-duplicate content
Problems with networks of sites on the same machine or in the same IP subnet
Problems with websites hosted in a country that doesn't match the TLD suffix
Backlinks with the same anchor repeated many times
Intensive Googlebot crawling activity on the site
Yes, I have the problem of "Backlinks with the same anchor repeated many times",
but what can I do now, as it's already in many forums, blogs, directories, sites, etc.?
1. Over-optimization of the page - I am looking at this and trying to determine if it's a possibility while making some changes.
2. Excessive anchor text to home page. I will look at this one as well.
The really frustrating part is I truly am trying to build a great site, only to see the competition win with techniques that shouldn't work, i.e. large amounts of reciprocal links and old content.
Yes, I do have a question... sorry. There is a blog on my site that I update several times a week. Also, I add fresh content to my site 1-2 times per month. Should I stop doing this? The sites that rank well don't seem to update much at all, if ever. I don't want to stop and have it appear that I am not consistent, because I have been for almost a year now. I went from a nobody to a nobody on page 2. I simply don't know what to do and I fear everything.
I notice the down-turns at just about the same time I get tons of Google Alerts about scraped content. Poland has been very bad about this lately... I'm just about to block the country entirely, but then they'll just scrape Google results and use those, which I'm seeing become more and more common.
On the same pages, link to a page on the site in the top navbar and in the left uber menu (that's twice). Then, throw in a footer link at the page bottom for good measure. Add to that a graphic at the beginning of the content section with an alt attribute. All with the same anchor text for the same page, giving 4 occurrences of identical anchor text on each page - on a lot of pages.
That's what I call excessive. And it was adding that graphic that triggered the -40 which persisted, but came off as soon as the graphic was removed. The footer link is also now gone from that site section.
There are other excessive internal linkage problems with that site, some involving that particular page, and more than likely some that are negatively affecting other important second-tier keyword phrases.
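If you want to audit your own templates for that kind of repetition, here's a rough sketch of the count I'm describing - it assumes the BeautifulSoup library and takes the page HTML as a string, and the sample markup is obviously made up:

```python
# Count repeated internal anchors on one page: the same href with the same
# anchor text (or image alt) showing up in the navbar, menu, footer, etc.
# Assumes `pip install beautifulsoup4`.
from collections import Counter
from bs4 import BeautifulSoup

def repeated_anchors(html):
    soup = BeautifulSoup(html, "html.parser")
    counts = Counter()
    for a in soup.find_all("a", href=True):
        img = a.find("img")
        # A linked image "counts" via its alt attribute, like the graphic
        # described above; otherwise use the visible link text.
        text = (img.get("alt") or "").strip() if img else a.get_text(strip=True)
        if text:
            counts[(a["href"], text.lower())] += 1
    # Anything appearing more than twice on the same page is worth a look.
    return {k: v for k, v in counts.items() if v > 2}

sample = """
<a href="/widgets">Blue Widgets</a>
<a href="/widgets">Blue Widgets</a>
<a href="/widgets"><img src="w.png" alt="Blue Widgets"></a>
<a href="/widgets">Blue Widgets</a>
"""
print(repeated_anchors(sample))  # {('/widgets', 'blue widgets'): 4}
```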
And it was adding that graphic that triggered the -40 which persisted, but came off as soon as the graphic was removed. The footer link is also now gone from that site section.
Interesting... it was the adding of a graphic on a third-party site with an alt attribute, pointing to our site, that caused the same slide on a site we have, back in late May '08. There was no link text involved.
I notice that 4 repeat occurrences of the same anchor text didn't affect your site.
Even though yours was internal, I wonder if this graphic / alt attribute issue is part of a pattern that Google has a concern over. It may coincide with perceived advertising networks.
Unlike in your example, the graphic was removed and we didn't return to the SERPs.
The "bad" spider might be checking for duplicate, near to duplicate and thin content pages.
I can't believe Google doesn't know the result of this is sites going up and down, often going from thousands (or tens of thousands) of hits to almost zero.
The question is: why would Google want this? What's the goal of yo-yoing a site?
Briefly, my imaginary construct (probably not the same as Ted's imaginary construct) is as follows:
There are a potentially infinite number of sets. Some of those sets are defined. Definition is done by profiling.
Not all sets are equal. They get folded into SERPS with different weighting.
Sets might not even be ranked internally using the same criteria as other sets. (I am trying to work out a way of testing or verifying this - suggestions welcome.)
Now, any given page MIGHT fit definitional criteria for more than one set (at which point the word 'set' becomes misleading, so swap for partition). A page gets categorised and put in a partition. Then as google churns away at number crunching, another partition appears suitable. Google spiders and re-evaluates, and the partition MAY OR MAY NOT change.
As early posters (I think) on this thread state - escaping the Yo-Yo would therefore mean changing your profile. More backlinks would be number 1 choice for me, then anchor text on internals.
Think Chaos Theory and Phase Changes.
As to why - partitions would allow a reduction in process overhead.
Candidates for their own partition:
Mega-authorities (seed sites)
Sandbox
-950
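To make that a little more concrete, here's a toy sketch of the idea - the partition names, thresholds and weights are completely invented, it's only meant to show the mechanism: a profile files a page into a bucket, and a re-evaluation can move it to another one that's folded into the SERPs with a different weighting.

```python
# Toy model of the partition idea - entirely hypothetical, the names and
# thresholds are made up to illustrate the mechanism, not Google's criteria.

def assign_partition(profile):
    """File a page into one bucket based on its profile."""
    if profile["authority_links"] > 500:
        return "mega-authority"
    if profile["age_months"] < 8:
        return "sandbox"
    if profile["anchor_text_repetition"] > 0.6:
        return "-950 / over-optimized"
    return "general"

# Each partition gets folded into the SERPs with a different weighting,
# so changing partition means a big jump rather than a one-place slide.
partition_weight = {"mega-authority": 1.0, "general": 0.7,
                    "sandbox": 0.3, "-950 / over-optimized": 0.05}

page = {"authority_links": 40, "age_months": 18, "anchor_text_repetition": 0.65}
p1 = assign_partition(page)
print(p1, partition_weight[p1])      # '-950 / over-optimized' 0.05

# After a re-evaluation the criteria (or the page) shift slightly...
page["anchor_text_repetition"] = 0.55
p2 = assign_partition(page)
print(p2, partition_weight[p2])      # 'general' 0.7 - and there's your yo-yo
```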
Social Networking, blogs, ecommerce and info sites seem to be treated differently from each other - both in terms of how they rank, and how they pass value to their downstream links.
I've been assuming that, in the case of social media, Google works to capture bursts of interest and new trends -- but with information sites, they look more toward returning authoritative results of a more conventional nature, so longevity of the links becomes more important. Several Google patents have mentioned this kind of possible variation in time sensitivity.
e.g. e-commerce is less "trustworthy" than an informational site purely because of what it is, not how it is structured, leading to a lower threshold of SERP stability.
[ btw - I'm not questioning your remarks Tedster - just probing another angle / theory against Shaddows' comment ]
Ted's observation about social media is exactly what I'm talking about in terms of downstream effects. Moreover, social media that RANKS poorly seems to punch above its weight in terms of LINK POWER.
Many, many ecommerce sites that rank page 1 have no business being there, if they were treated equally with info sites. I can't believe how thin some of my ecom competitors are, and they are still right there on page 1. There is also a much higher tolerance for cross-site dup content.
Social media doesn't seem to have to rank well at all to pass value. Relevance seems more important than in other Link Juice scenarios.
Blogs need to have some tangible standing in order to be valuable, and boilerplate links seem even more negligible than in other places.
That said, I think the partitions (if they exist) are much more granular than simply those broad-brush definitions. And it's within these tighter definitions that the yo-yo occurs. Is it a 'professional' info site, or one made by your aunt down the road? They get treated differently (auntie gets more lenient treatment).
Borrowing some terminology...
Say your combination of hand-written titles, good navigation and unique content gives you a Triple-A rating. You get ranked with your peers, then folded into SERPS.
Now, say G re-evaluates your sector and changes the criteria for Triple-A. You may now become ALT-A rated. Now, rather than lose a single place, you experience a big drop because you are getting folded into SERPS in a different way.
Further, if the Triple-A partition has a defined number of occupants, and the profiles of those occupants are fed back in to determine the criteria for AAA-rating (to help define the bell curve for each variable, for example), then the effect will be quite chaotic at the margin. The bell curve gets tweaked, some Alt-A's get promoted, some AAA relegated, then the Triple-A criteria get redefined. The new definition means more promotion/relegation and the cycle starts again.
Your ranking becomes stable when EITHER:
1) Your factors change (you add more links or content)
2) The promotion/relegation cycle produces a Triple-A profile in which you are no longer at the margin (you will probably remain where you were, if nothing changed)
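To show why I think the margin would behave so erratically, here's a toy simulation of that feedback loop - all the numbers are invented; the point is just that pages well above the line never move, while pages at the margin keep flipping in and out:

```python
# Toy simulation of the promotion/relegation loop - all numbers invented.
# The Triple-A partition has a fixed quota; the cutoff is re-tuned each
# cycle from how many pages currently qualify. Pages well above the line
# never move; pages at the margin flip in and out - the yo-yo.

scores = {"siteA": 9.5, "siteB": 9.0, "siteC": 8.6,
          "siteD": 8.4, "siteE": 8.3, "siteF": 7.0}
quota, cutoff, step = 4, 8.5, 0.2

for cycle in range(6):
    members = [name for name, score in scores.items() if score >= cutoff]
    print(f"cycle {cycle}: cutoff={cutoff:.1f}, Triple-A={members}")
    # Feed the result back in: too few occupants -> loosen the criteria,
    # too many -> tighten them. siteD and siteE flip in and out every cycle.
    cutoff += step if len(members) > quota else -step
```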
Other than Ted's original suggestion and some commentary by Whitenight at the time, this line of thought has developed in a vacuum, so some critical feedback, thoughts and opinions would be really welcome.
...The bell curve gets tweaked...
Yes. Follow this line of thought all the way thru.
I again cannot emphasize enough the importance of starting with the END RESULT in mind.
Your ranking becomes stable when
Everyone's goal - Rank Top 3 for given keywords.
Figure out how to do this FIRST and then the yo-yo becomes an annoying "result" of those who don't understand how to systemically accomplish the first part.
Many, many ecommerce sites that rank page 1 have no business being there, if they were treated equally with info sites. I can't believe how thin some of my ecom competitors are,
Ok? So how do YOU use this to YOUR advantage?
Shaddows,
You were one of the people who SAW the Ghost Datasets and rollout.
You simply need to put together the pieces starting with the END RESULT in mind. Once you do that, the yo-yo issue will fall into place almost "by accident"
And before, there was only one keyword for which my site disappeared - no doubt that keyword was sending me 50% of my traffic from Google - but now the full site is under the yo-yo effect.
I'm very tense. Has anyone here found what the reason for this could be, and how to get rid of it?
I was under the yo-yo effect for a while for a top, highly competitive KW in my category, and then my ranks dropped. My site was not new but entirely unoptimized before. I am now on page 2 for that keyword, which is a bit better than before the yo-yo. The good news is that I am now - after the yo-yo stopped - top 5 for a similar, equally important keyword, so Google at least hasn't penalized my site as far as I can gather.
Yes, this thread began with your post of last summer. That phenomenon involves rankings going back and forth from page one (usually #4) to something lower and off page one. But the discussion has now morphed into discussing up and down rankings that are buried a lot deeper and never get up to a page 1 "test".
tibiritabara
Well, most of my keywords were number one or 2nd.
And what was the problem with the internal linking of your site, and how did you redesign it all? If you can, give us some details.
MarieN
Well, my site mostly has pages with the same title bar and meta description, as it is a script-based site whose script was designed in 2004.
You said you are now on page 2 - have you redesigned the site or URLs, or what steps have you taken to get out of this yo-yo?
And Whitey, if you would please share your experience as well - what problem was there on your client's site and what did you do for them?
The sandbox was detected as the first kind of penalization - I'm not sure, 2003 or 2004? Never mind. A few weeks later the SEO community could identify the origin of this 'penalization': new websites don't rank well in the first 6 or 8 months.
After that, the Google research team improved its control of the SERPs in two directions: link building and content quality. They have produced several 'filters' which run in different Google data centers with specific parameters (the Spanish index doesn't have the same kind of control as the English index).
My hypothesis is that these parameters become more restrictive according to the index size. The larger the index, the stricter the parameters that are passed to the 'filters'.
A stable website is OK for each of these 'filters'. If your website has a strong and deep 'yo-yo' effect, then your website is not OK for a high number of 'filters'. If your website is more stable, but you can't get first place for relevant keywords or phrases and every 5 or 6 months it goes into a 'yo-yo', your website has some penalization.
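Expressed as a toy formula (the function and numbers are invented, purely to illustrate "bigger index, stricter parameters"):

```python
import math

# Hypothetical only: a filter threshold that tightens as the index grows,
# which is one way to read the "larger index, stricter parameters" idea.
def duplicate_content_threshold(index_size, base=0.80, k=0.02):
    # Maximum allowed similarity between pages before the filter kicks in.
    return max(0.5, base - k * math.log10(index_size))

print(duplicate_content_threshold(10**6))   # small index: ~0.68
print(duplicate_content_threshold(10**10))  # huge index:  ~0.60
```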
fsmobilez
An important element we have to watch in order to avoid search engine penalties is the website's information architecture. My client had a terrible navigation layout. After we fixed all these problems the website stayed penalized for nearly one year... so if you have this problem... relax.
My hypothesis is that these parameters become more restrictive according to the index size. The larger the index, the stricter the parameters that are passed to the 'filters'.
I believe this might be true.
I have 2 structurally identical sites, one is yo-yoing, the other is stable and always 1st page for targeted keywords (spot 1 or 2 most of the times).
Yo-yoing site has a lot more backlinks (quality ones) and is richer in content.
The main difference between the two sites is that the non-yo-yoing site has around 100 pages while the yo-yoing one has 2,500.
It is crystal clear!
I have tried everything with that page.
I even de-optimized it to the point that the keywords I'm targeting were not even 1% of the text and not in the h1.
This page is strong as an ox and had been ranking #1 for those specific keywords for years, until June 26th, 2008 (the site is 12 years old).
Then it disappeared for a few months, then it came back but far from the 1st page, and now for at least 3 months it has been back on page 1 but with this damn -5 penalty.
Most of the time, I consider it as a sickness and I just live with it. Then sometimes, it makes me mad, so I change a word or two in the title or description or h1 and guess what?
Google crawls it within a couple of hours and the page is back to #1... for the rest of the day, then, bam, back to #6.
It never misses: always the same pattern.
I'm thinking of making a script that would display the page title, description and/or h1 with a slight variation for each day of the week...
The hell with Google and its mysterious penalties!
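For what it's worth, that day-of-the-week variation is trivial to script - whether it's wise is another question, since deliberately rotating titles to dodge a filter could itself look manipulative. A rough sketch, with placeholder titles:

```python
# Rough sketch of the "rotate the title by day of week" idea from above.
# The variant strings are placeholders, not recommendations - and fair
# warning, cycling on-page text to chase a filter may itself look manipulative.
from datetime import date

TITLE_VARIANTS = [
    "Blue Widgets - Handmade Blue Widgets",      # Monday
    "Handmade Blue Widgets for Every Budget",    # Tuesday
    "Blue Widgets, Made by Hand Since 1997",     # Wednesday
    "Buy Handmade Blue Widgets Online",          # Thursday
    "Blue Widgets - The Handmade Specialists",   # Friday
    "Handmade Blue Widgets - Free Shipping",     # Saturday
    "Blue Widgets Crafted by Hand",              # Sunday
]

def todays_title(today=None):
    today = today or date.today()
    return TITLE_VARIANTS[today.weekday()]  # Monday == 0 ... Sunday == 6

print(todays_title())
```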
In that case, I wouldn't mind.
The thing is, I came across this "minus 6 penalty" and since that is what is happening, I would like to understand.
[google.com...]