Forum Moderators: Robert Charlton & goodroi


"Penalties" - when is a penalty not a penalty


Whitey

1:28 am on Aug 14, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Several years ago the threads here were awash with claims of sites being penalised, when all it was were thousands of webmasters unable to understand that they had canonical / duplicate content issues that caused their sites to be filtered out of the SERPs.

After two years of debate, some glimpses of hope finally started to appear; g1smd and a few others got together and the issues were largely resolved. Tens of thousands of webmasters were a little wiser. It was never a penalty, though.

Having just witnessed another miraculous return to the SERPs by a site that was dinged several years ago, I'm wondering if 301 redirect tangles - and the subsequent resolutions that come into play - are another one of those "Google confusion" filters that webmasters have interpreted as a "penalty".

So when is a penalty not a penalty? And what have the last few years of experience taught us about sites that have returned mysteriously?
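The canonical / duplicate content issue described above is, at root, many URL variants serving the same page. A minimal sketch of the kind of normalization involved (the URLs are hypothetical and the rules shown are common webmaster conventions, not anything Google has published):

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url):
    """Collapse common duplicate-URL variants to one canonical form."""
    # Lowercasing the whole URL is a simplification - real paths can be
    # case-sensitive; hostnames, however, are not.
    scheme, netloc, path, query, fragment = urlsplit(url.lower())
    if netloc.startswith("www."):          # www vs non-www duplicates
        netloc = netloc[4:]
    if path.endswith("/index.html"):       # directory-index duplicates
        path = path[:-len("index.html")]
    if not path:
        path = "/"
    # Drop query string and fragment (e.g. session IDs, tracking params)
    return urlunsplit((scheme, netloc, path, "", ""))

# All of these variants collapse to the same canonical URL:
variants = [
    "http://www.example.com/widgets/index.html",
    "http://example.com/widgets/?sessionid=123",
    "http://EXAMPLE.com/widgets/",
]
print({canonicalize(u) for u in variants})  # → {'http://example.com/widgets/'}
```

When a site serves all of these variants without consolidating them, a crawler has to guess which one is "the" page - which is the filtering effect the thread describes, rather than a penalty.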

tedster

5:32 pm on Aug 14, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Thanks for bringing this up, Whitey. This topic is one of my focuses with any site owner.

1. Not every drop in rankings is a penalty. In fact, most are not. Most ranking drops I see are caused either by an algo change or by a technical problem on the website.

2. If rankings were never high, then low rankings are not a penalty. By definition, a penalty takes something away.

3. Duplicate content may cause ranking problems, or it can cause your preferred URL for the content to be hidden by a filter. But this is not a true penalty - as in a black mark on Google's back end.

It used to be that true Google penalties were often very harsh. Why was the old name "Reinclusion Request", only recently changed to "Reconsideration Request"? Because a common penalty "in the old days" was simply to kick the domain completely out of the Google index. A site:example.com query returned NO results.

Whitey

12:33 am on Aug 15, 2010 (gmt 0)




I'm going to throw out a "wild" statement and see what thoughts are out there. Clearly I don't know what goes on, but I do have a pretty strong hunch.

It seems to me that Google doesn't penalise any site, except for "gross violations" where the integrity of the results, or the overall methods of optimisation, could get beyond Google's control. Google's "flagging" algorithm, community and editors take care of this fairly well, weeding out such sites.

Google's objective is to make sure that it promotes the most relevant content and that it keeps a level of respect for its guidelines, so that it doesn't lose control of its product.

Wild Statement: All this talk of "-50" penalties, "-950" penalties etc. is nonsense.

Google is just reacting algorithmically to some of its own confusion about the relevance and integrity of those sites, and puts them into a holding pattern until the webmasters sort things out, or Google manages to overcome its distrust.

Highly trusted / branded sites are rarely affected, unless they demonstrate severe abuse - and then the penalty is quickly reversed. A classic example of this was the BMW site a few years back.

The bottom line is that webmasters who come to these forums are much more savvy than the average business owner or webmaster who doesn't, and Google has to cater for that level of simplicity as well.

Does anyone dispute this?

tedster

12:52 am on Aug 15, 2010 (gmt 0)




Wild Statement: All this talk of "-50" penalties, "-950" penalties etc. is nonsense.

Maybe it is nonsense today - but it definitely wasn't when the following discussion was held: Google -30 & 950 Penalties - brief summaries [webmasterworld.com]. And when the REAL -50 penalty first came up, it was pretty rare but also very real and obvious: the domain would rank exactly at position 51 for a query on [example.com] instead of at #1. And every previously strong ranking keyword for the domain fell the same way.

These days, such precise penalties are kind of sinking into the waves of the new Google. So I also think that naming your penalty by the number of positions you fell probably doesn't cut it any more.

Whitey

1:19 am on Aug 15, 2010 (gmt 0)




such precise penalties are kind of sinking into the waves of the new Google


I guess I struggle with semantics when you and most others use the word "penalties" - it's hard to find another word, since it's become a catchall in our culture [I understand].

But I don't believe that these are penalties. These effects were more ambiguous than that.

I'd like to propose that they were "interim sandboxes".

This, in my view, plays into your assertion in the thread linked above that "thresholds" played a big part.

This was caused by a change or shift in the way Google scored a page, or by a site being reconfigured, for example. That site could well have been caught in some problem - such as Google being unable to handle multiple redirects in its trust considerations - so it needed to let things settle down for a while.

As such, I believe an interim "sandbox" may exist.
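The "multiple redirects" tangle mentioned above can be made concrete with a toy chain resolver (the URLs and the hop limit are hypothetical - crawlers generally give up on long chains, but the exact limits aren't published):

```python
def resolve_redirects(url, redirect_map, max_hops=5):
    """Follow a chain of 301s in a site's redirect map.

    Returns (final_url, hops). Raises if the chain loops or exceeds
    max_hops - the kind of tangle a crawler may simply give up on.
    """
    seen = []
    while url in redirect_map:
        if url in seen or len(seen) >= max_hops:
            raise RuntimeError(f"redirect tangle at {url}: {seen + [url]}")
        seen.append(url)
        url = redirect_map[url]
    return url, len(seen)

# Hypothetical legacy-URL cleanup that grew into a three-hop chain:
redirects = {
    "/old-page": "/old-page/",
    "/old-page/": "/products/old-page",
    "/products/old-page": "/products/widgets",
}
print(resolve_redirects("/old-page", redirects))  # → ('/products/widgets', 3)
```

Collapsing each old URL to a single direct 301 to the final destination removes the ambiguity that a chain like this creates.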

the waves of the new Google.

Tedster, can you offer some deeper insight into what you perceive this to be, in the context of this conversation?

tedster

3:14 am on Aug 15, 2010 (gmt 0)




The older patents from Google actually mention two possible ways of assessing spam penalties: subtracting a set number (example: -50) and multiplying the score by a percentage (example: the end-of-results penalty, aka -950). A penalty could be assessed for a certain period of time - and if you want to call that a "sandbox", you could - but Google called it a penalty.
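As a toy illustration of those two penalty shapes - the scoring model here is invented purely for illustration and is not Google's actual math:

```python
def penalized_rank(natural_rank, penalty_kind, amount):
    """Illustrative only: the two penalty shapes from the old patents.

    "subtract": demote by a fixed number of positions, so a -50 penalty
    moves a #1 result to exactly #51.
    "multiply": scale the page's score down by a factor; with ~1000
    visible results, a tiny factor lands the page near the end of the
    results (the so-called -950 / end-of-results effect).
    """
    if penalty_kind == "subtract":
        return natural_rank + amount              # e.g. #1 + 50 -> #51
    elif penalty_kind == "multiply":
        score = 1.0 / natural_rank                # crude rank-to-score model
        penalized = score * amount                # amount is a factor < 1
        return min(round(1.0 / penalized), 1000)  # cap at result-set size
    raise ValueError(penalty_kind)

print(penalized_rank(1, "subtract", 50))     # → 51 (the classic -50)
print(penalized_rank(1, "multiply", 0.001))  # → 1000 (end of results)
```

The subtractive shape explains the eerie precision of the old -50 reports (every strong keyword landing exactly 50 positions lower), while the multiplicative shape explains why "-950" sites clustered at the end of the results rather than at an exact offset.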

At PubCon Vegas many years ago, I was at the courtyard bar of the Renaissance hotel with Matt Cutts and he was describing how Google was assessing spam penalties - mostly manual at the time - and how they planned to automate more and more of the process, including the automated lifting of penalties.

The fact is that Google actively looks for methods of artificially boosting rankings, and if they go beyond a certain level, then Google knocks that ranking back down. And THEY call it a penalty. And I'm darned sure there's a record kept for a domain any time this happens.

Not everyone who runs into a ranking loss today knows that they are spamming. Some practices that people kept closely held a few years ago stopped being effective - Google caught on. Then they leaked into more public knowledge. People pick up that "advice" here and there, apply it without testing, and get penalized as spammers.

Other sites just don't rank well. They never ranked well, so you can't assume that their low ranking is a penalty.

the waves of the new Google.

Tedster, can you offer some deeper insight into what you perceive this to be, in the context of this conversation?

If you check rankings within the Webmaster Tools report, you can see what positions your URLs get impressions for - and it's all over the place. The patterns can often appear wave-like. In addition, people who track lots of ranking data over time see wave-like patterns flowing through the SERPs as rankings churn. These are the kinds of waves that suggest functions you get in some areas of formal math - complex analysis, statistics, matrix integration and the like.

I don't have this nailed down, and I'd need to be near the top of the ranking team at Google to have enough information even to get close. But rankings today often seem to change in wave-like patterns. It's as though the algo is sampling the user results that various ranking formulas generate, and analyzing that user feedback in an automated fashion.

When these wave-like patterns were first noticed, some called it a yo-yo, some called it a sine wave, etc. But it's not as regular as a sine or a cycloid. Whatever is going on here, Google has become statistically very sophisticated. And that's what I refer to as "the waves of the new Google". Today's Google does not simply score URLs as if they had a checklist of factors and they were assigning points for various factors - and then the pages with most points win.

We're on the outside trying to look into a black box we call the ranking algo, trying to figure what goes on in there. And the mental model of a checklist is still a workable approximation a lot of the time.

But it looks to me like that isn't really the way things work any more - not with real precision. The checklist is more like the way many search engines worked years ago. Today there is statistical sampling, waves of ranking changes, and at least some automated application of statistical feedback from user data.

And in the midst of these ranking waves, there are websites that trip some automated flag - and their previously strong rankings still do get penalties.

Whitey

3:57 am on Aug 15, 2010 (gmt 0)




Are all significant factors involving site health sufficiently covered in WMT?

Why not have some recognition reported back in the panel - say, in the example case of multiple redirects, which can cause havoc for Google - and for other major problem areas?

e.g. "you have done x, and this confused us", or "it's OK".