
Is Google Using a Position #6 "Penalty"?

     

tedster

10:35 pm on Dec 26, 2007 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Something is happening that was mentioned in our December 2007 SERP Changes [webmasterworld.com] thread and deserves a dedicated thread.

What some site owners are reporting is that search rankings that had held for a long time, often at #1, were suddenly knocked down to #6. These reports happen often enough that it looks like there might be something specific going on. However, there are always ranking shifts, so zeroing in on exactly this one thing can be difficult.

-- Here are the main signs --

1. Well established site with a long history.
2. Long time good rankings for a big search term - usually #1
3. Other searches that returned the same url at #1 may also be sent to #6, but not all of them
4. Some reports of a #2 result going to #6.
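The signs above can be checked mechanically against rank-tracking data. Here's a minimal sketch of such a check; the log format (term mapped to a dated position series) is an assumption, not any particular tracker's export:

```python
# Hypothetical sketch: scan rank-tracking history for the "#1 -> #6"
# pattern described above. The data format here is assumed; adapt it
# to whatever your own rank tracker exports.

def find_position_six_drops(history, stable_days=30):
    """history: dict mapping search term -> list of (date, position),
    sorted by date. Flags terms that held a top spot (#1 or #2) for
    `stable_days` checks and then landed exactly at #6."""
    flagged = []
    for term, series in history.items():
        if len(series) < stable_days + 1:
            continue
        before = [pos for _, pos in series[:-1][-stable_days:]]
        _, latest = series[-1]
        # long-held top ranking that now sits exactly at #6
        if latest == 6 and all(pos <= 2 for pos in before):
            flagged.append(term)
    return flagged

history = {
    "blue widgets": [(d, 1) for d in range(40)] + [(40, 6)],
    "red widgets": [(d, 1) for d in range(40)] + [(40, 2)],
}
print(find_position_six_drops(history))  # -> ['blue widgets']
```

Requiring the drop to land exactly at #6, rather than anywhere below the old spot, is what separates this pattern from ordinary ranking churn.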

-- What we can identify so far --

A. It's search term specific (usually the biggest and best converting phrase)
B. Therefore, not a url or domain-wide penalty on all terms
C. A little testing on one site seems to show it's not an on-page problem
D. That leaves off-page but on-site, or off-site, or possibly backlink issues

-- Some loose guesswork and brainstorming --

i. Backlink profiles are not diverse enough - is this a new algo tweak on that factor?
ii. Backlinks are aging or stagnating, with no new ones being added?
iii. I thought about the possibility of paid link devaluation (even going back two or three steps from the site) but that would not consistently place a url at #6, so I've ruled that out.

Is anyone else seeing this Position #6 problem? Something like this could be hard to separate out from all the other movement that the SERPs show.

However, I've now seen it happen to key terms on three different sites operated by the same person (different WhoIs, no incestuous linking) and two corporate sites. Plus there are several other reports in the December SERP Changes thread. Every one of these cases seems to be hitting the domain root, not an internal url.

I'm not happy with the current level of analysis, however, and definitely looking for more ideas.

[edited by: tedster at 6:28 pm (utc) on Dec. 29, 2007]

ChiefBottleWasher

10:58 pm on Dec 27, 2007 (gmt 0)

5+ Year Member



I meant, more specifically, that Google regards one-way and reciprocal links as equivalent, like for like.

ChiefBottleWasher

1:03 am on Dec 28, 2007 (gmt 0)

5+ Year Member



The salient discussion above digresses from what we're trying to agree on here. I think that what I suggest makes sense; does anyone else have any on-topic observations? Do you agree with my theory that the #6 position could be a safety net, tedster?

The observations about backlinks made by Katie look pretty definitive to me, and in line with my own theories. I would certainly bet on this being a link issue, but I wouldn't dismiss the click-through idea either, although it seems a bit draconian.

I wasn't intending to stifle debate and I'm disappointed that this thread has gone quiet.

ChiefBottleWasher

1:06 am on Dec 28, 2007 (gmt 0)

5+ Year Member



In terms of user data we have site stickiness, clickthrough rate and that's about it. Maybe Google can monitor propensity to return to the site using the toolbar.

Do we really think that Google will start to rate user preferences above all the traditional SEO factors?

Marcia

2:00 am on Dec 28, 2007 (gmt 0)

WebmasterWorld Senior Member marcia is a WebmasterWorld Top Contributor of All Time 10+ Year Member



>>observations about backlinks made by Katie look pretty definitive to me

What observations? That backlinks from blogs are somehow involved?

JeremyL

3:25 am on Dec 28, 2007 (gmt 0)

10+ Year Member



@chief

Absolutely. A better question would be: can we believe Google is not trying to move beyond links, which have become simple to manipulate? It's not a question of whether Google is going to start using user data. They already have. I've seen it happen with new sites on a few occasions.

Marcia

3:50 am on Dec 28, 2007 (gmt 0)

WebmasterWorld Senior Member marcia is a WebmasterWorld Top Contributor of All Time 10+ Year Member



They can also incorporate additional metrics for evaluating the relative value of links, which aside from whatever else is happening, I believe they're doing.

potentialgeek

4:57 am on Dec 28, 2007 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



I think link valuation wrt reciprocal links etc. has been on the front burner at Google since mid- to late October.

This may be just another tweak/revision/test. At least it's "small," i.e., not a -950 (if indeed it is a penalty). Of course, I wouldn't call a five-position SERP drop "small" or insignificant. For me it can mean a traffic/revenue drop of 30-50%.

p/g

tedster

4:57 am on Dec 28, 2007 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Do you agree with my theory that the #6 position could be a safety net, tedster?

It's worth keeping in mind. The idea that clustered results end up at #6 and #7 is just the natural result of the way clustering works: any two results from the same domain that occur on the same page of results will always cluster. It's a last-minute action applied to the rough results.
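That last-minute clustering step can be illustrated with a toy sketch. This is not Google's actual code, just a minimal model of the behavior described: when two results from the same domain land on the same page, the lower one is pulled up to sit directly under the higher one.

```python
# Toy model (not Google's implementation) of last-minute host
# clustering applied to an already-ranked page of results.

def cluster_same_domain(results, page_size=10):
    """results: ranked list of (domain, url) tuples. Returns the first
    page with same-domain results grouped together, preserving the
    order in which each domain first appeared."""
    page = results[:page_size]
    clustered = []
    for domain, url in page:
        # place this result directly after the last result already
        # taken from the same domain, if any
        insert_at = len(clustered)
        for i in range(len(clustered) - 1, -1, -1):
            if clustered[i][0] == domain:
                insert_at = i + 1
                break
        clustered.insert(insert_at, (domain, url))
    return clustered

serp = [("a.com", "/1"), ("b.com", "/1"), ("a.com", "/2"), ("c.com", "/1")]
print(cluster_same_domain(serp))
# a.com/2 is pulled up to sit directly under a.com/1
```

Because the grouping runs after ranking, a second same-domain result that would have ranked #7 on its own gets stapled to its sibling, which is exactly the 6+7 effect described above.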

I keep going back to an older theory I had but could never pin down about Google actions to enforce a specific position - in this case a ceiling rather than a safety net. There has long seemed to be a kind of barrier for going from #11 to #10. It just seems to take something more than any other upward shift of one single position.

I think Google is becoming ever more proactive in crafting the first page of results, and this position #6 sticky spot may be yet another action in that direction. It seems to me that "universal search" may require this, and even more, may be providing the infrastructure needed to play around with "forced positions".

And speaking of universal search, I noticed in December that the number of images on the SERPs really fell off. YouTube videos still have a "plus box" to click, but not often a still image.

So, I'm conjecturing here: maybe these new "forced to #6" results are those previous #1 results that haven't been performing as well, on click-throughs and click-backs, as some expected norm would predict. So no matter what Google's relevance algo says the ranking should be, these urls get sent down the ladder to see if there's a better candidate already on the page for making Google's users happier.
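The conjecture above can be made concrete with a speculative sketch: compare a url's observed click-through rate against an expected-CTR curve for its position, and demote chronic underperformers to a fixed "trial" slot. The CTR numbers and the tolerance threshold below are invented for illustration only:

```python
# Speculative sketch of the click-through demotion idea. The expected
# CTR-by-position curve and the tolerance value are made-up numbers,
# not anything Google has published.

EXPECTED_CTR = {1: 0.35, 2: 0.17, 3: 0.11, 4: 0.08, 5: 0.06, 6: 0.04}

def adjust_position(position, observed_ctr, trial_slot=6, tolerance=0.7):
    """Demote a top result whose observed CTR falls well below the
    norm for its position; otherwise leave its ranking alone."""
    expected = EXPECTED_CTR.get(position)
    if expected and position < trial_slot and observed_ctr < expected * tolerance:
        return trial_slot
    return position

print(adjust_position(1, 0.20))  # underperforming #1 -> demoted to 6
print(adjust_position(1, 0.33))  # healthy #1 -> stays at 1
```

A fixed trial slot like this would explain why the drops land at exactly #6 rather than scattering across the page, which is the oddest feature of the reports.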

Also, some reports came in that the #6 position would appear on google.com while the url would still be #1 on aol.com. However, all those examples went away and aol.com is now in line with google.com, except of course that the total number of results is lower, since aol apparently doesn't get the supplemental database.

There are always lots of other ranking shifts at any time - but this one is so specific, going from a #1 to #6, that I'm hoping we can come up with something useful here. Unfortunately, we don't have a way to get Google's click through and click back data, so I can't quite see how to test my latest idea.

Marcia

5:06 am on Dec 28, 2007 (gmt 0)

WebmasterWorld Senior Member marcia is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Does anyone recall if there are any papers or patents published that make mention of a devaluing in the SERPs for a very specific reason (backlink related), which could be interpreted as meaning it's for a certain number of spots downward if the metric is met negatively?

I recall such a mention, but don't remember where I read it.

potentialgeek

12:06 pm on Dec 28, 2007 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



Long time good rankings for a big search term - usually #1

I've often struggled with the freshness issue after getting top spot in the SERPs. I usually lean towards "If it ain't broke, don't fix it," especially when I have no idea whether a small tweak or big addition will collide with the latest gnarly Google algo incantation and result in a -950, -30, or now maybe -5 penalty.

At the same time I've known freshness can count toward something, so I know that the blessing of top spot may not last forever.

I don't know how to find balance between the right amount of freshness on the one hand, and the right amount of stability on the other.

Google may be revising its algo to scrutinize sites at the top, "a fault-finding mission."

"Does it really deserve to be there?"

Double-Check for The Top Dog=Google Paranoia at SEO Gurus

p/g

This 164 message thread spans 17 pages.