Forum Moderators: Robert Charlton & goodroi


The "Minus Thirty" Penalty - part 3

#1 yesterday and #31 today


tedster

7:11 pm on Nov 18, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



< continued from: [webmasterworld.com...] >
< part one: [webmasterworld.com...] >

First thing I want to clarify is what this phenomenon looks like: your domain used to rank well for a number of searches, and now all those searches show you at position #31, top of page 4. The very best test to discover if you are infected is this: do a search on your domain name itself - type example.com into the Google search box, a search where you naturally expect to be #1. If you have this particular penalty against you, then even that search will show you at position #31.

No other types of suspected penalties are relevant to this thread. If you are not showing #31 for a search on your domain name, then this discussion does not apply to your site.

This position #31 penalty is not at all widespread. I brought the topic up all over Las Vegas PubCon this past week -- and I barely found anyone, even in this seriously hooked-up crowd, who had a clue what I was talking about. And for the few who did, it was because they had read this thread, not because they were bumping into it on their sites or with their clients.

Adam commented a bit on Google Groups but said he would not comment further because of Google's secrecy.

This seems to be the official comment from Google: no comment. Even with 25 Google employees in attendance at PubCon, no further comment was to be heard. As I said, the crowd there paid little attention to the topic either.

Although some who suffer this experience appear to be mystified, I sense that the majority have quite a good sense of what's happening - what past marketing approaches may have brought down wrath from Mountain View. It clearly IS associated with practices that were aimed directly at manipulating the Google SERPs, rather than honest marketing practices. Maybe the site owner doesn't know what someone else in the company did in their name, and maybe they're just dissembling.

It seems to me the position #31 penalty is a warning shot -- and a very unusual one at that, quite loud and low across the bow. I believe it will not be a long term feature of the way Google functions. I do not have any sense that new sites will be contracting Google Flu #31 in an ongoing fashion. One morning, not too far from now, we will wake up and not see this.

Until that morning, I think patience and good hygiene in online marketing are the way to go. Scour the Google Webmaster Guidelines, and demand full disclosure from all staff and third parties involved in online marketing/SEO.

[edited by: tedster at 3:49 pm (utc) on April 5, 2007]

avalanche101

10:43 am on Nov 28, 2006 (gmt 0)

10+ Year Member



Percentages,
I think your perception is a little out of sync.
Work with them, don't play against them, as the house always wins.

VNelson

6:33 pm on Nov 28, 2006 (gmt 0)

10+ Year Member



Tedster,
Important as duplicate content issues are for EVERY website, they are not the reason for a position 31 penalty -- or else there would be a lot more of this going around than there has been.

Like Gimp, I am concerned by your comment. This duplicate content issue is the only issue I can find that could be causing our penalty, and other people on the forums have mentioned it as a possible cause. Why do you feel so confident that it's not the cause?

I too find it hard to believe it could cause the penalty given how many others would be affected. I wonder if the penalty only affects sites with PR of 5 or higher or something along those lines, thus limiting the numbers. In other words, there could be co-factors that keep it from affecting a larger number of sites.

Also like Gimp, we have resolved the development site issue as far as I understand, and still no change in Google. It was fixed about 2 weeks ago. Actually, the only change I've seen in our results is for the worse - we are around #42 for domain.com now. I suppose it could just take time, or else Tedster is right that this isn't the cause and I'm back to the drawing board.

Here's why I think this is the only issue on our end:
1. We have numerous other sites with no penalty and with the same back-end construction and other similarities. None of the sites are duplicates of each other. Some have shared content for user purposes, and Matt Cutts of Google told me last Spring that he thought what we were doing with shared content was fine. The only thing different about the penalized site is that we had that development site (with exact and complete duplicate content) started about 2 months ago on a different domain. The other sites had a similar development site set up on their same domain. It was a fluke that this site was set up differently.

2. We have not done any links to bad neighborhoods or anything related to spammy linking. (Our few links are to very trusted organizations such as American Heart Association.) Of course, many junk sites link to us and some others scrape our content but I expect and have been assured that Google can tell whose content is original (this better be true!). And it doesn't seem logical for many reasons that they would penalize us for bad sites linking to us. There have been no large-scale linking campaigns at all. Everything has been gradual, natural, and for user purposes. Seriously, we don't have the staff for heavy-handed SEO.

Any thoughts? Thanks everyone!

tedster

6:35 pm on Nov 28, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I still have a number of supplementals for pages that I really don't want to be associated with my site going back nearly a year. They are proving quite hard to get rid of...

You cannot immediately "get rid of" the supplemental results -- Google will hold onto them for as long as they want to, even up to a year. But if you fix the underlying cause, and those urls are now handled appropriately (404, 301, whatever is right for you), the fact that they still show as supplemental will not hurt you - that's been the experience of many.
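
A minimal sketch of how one might spot-check that those fixes took hold (assuming Python with the third-party requests library; the URLs and expected status codes below are placeholders, not from this thread):

```python
# Sketch: verify that old, supplemental-only URLs now answer with the
# status you intended (301 for moved pages, 404 for gone-for-good pages).
# The URL list and expected codes are hypothetical examples.
import requests

EXPECTED = {
    "http://www.example.com/old-page.html": 301,   # moved permanently
    "http://www.example.com/dev-copy.html": 404,   # removed for good
}

for url, want in EXPECTED.items():
    # allow_redirects=False so we see the raw status, not the redirect target
    resp = requests.head(url, allow_redirects=False, timeout=10)
    flag = "OK" if resp.status_code == want else "CHECK ME"
    print(f"{flag}: {url} -> {resp.status_code} (expected {want})")
```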

But supplemental problems, though they definitely should be fixed IMO, are not the root cause of the #31 penalty -- as is discussed above.

nippi

9:46 pm on Nov 28, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Let me explain my question in more detail, Gimp.

How did links to bad neighborhoods get on your site, and how did they get passed as worthy to be on your site, if you stood to gain nothing from a link exchange and they weren't affiliate links...

You just decided to whack up a whole heap of damaging links on your site for no reason?

Um, of course I know how to add a link. I was trying to work out why the links had been added, what purpose they served. If not for an exchange, then why weren't they rel=nofollowed?

[edited by: tedster at 11:25 pm (utc) on Nov. 28, 2006]

tedster

11:24 pm on Nov 28, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Gimp did say "they were put on my site by my people". That is not uncommon when staff is involved. The owner of the site doesn't always know everything that is going on, even though it is done in his name.

I would also suggest that people definitively verify their outbound links by visiting each site, not just by trusting a "broken link" report or trusting their memory that the target was a good site. Domains do get sold, expire, and sometimes change drastically. This fall I found several links that originally went to medical authority domains that are now more about playing doctor than being a doctor.
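
A rough sketch of that kind of outbound-link audit (again assuming Python and the requests library; the link list is hypothetical):

```python
# Sketch: follow each outbound link and report where it really lands now,
# so a sold or expired domain shows up even when the link isn't "broken".
import requests

outbound_links = [
    "http://www.example-medical-authority.org/heart.html",
    "http://www.example-partner.com/resources.html",
]

for url in outbound_links:
    try:
        resp = requests.get(url, timeout=10, allow_redirects=True)
        # resp.url is the final URL after any redirect chain
        note = "" if resp.url == url else f"  <-- now redirects to {resp.url}"
        print(f"{resp.status_code}  {url}{note}")
    except requests.RequestException as exc:
        print(f"ERR  {url}  ({exc})")
```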

Nick0r

12:23 am on Nov 29, 2006 (gmt 0)

10+ Year Member



tedster, and in the case of forums? Many people link out - should every link have a nofollow attached to it?

nuthin

1:40 am on Nov 29, 2006 (gmt 0)

10+ Year Member Top Contributors Of The Month



Unfortunately one of my sites got hit with this or something similar today.

Errm, what to do from here. :/ Time to read this thread fully, methinks!

tedster

1:50 am on Nov 29, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



tedster, and in the case of forums? Many people link out - should every link have a nofollow attached to it?

That's one good approach -- and there are others. Live moderation, nofollow attributes, and scripted redirects are all possible approaches. Leaving a forum (or blog comments) open to unmonitored linking is not a good idea if you want search engine traffic. You've noticed how closely we police links here, right?
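
As a hedged illustration of the nofollow option (not WebmasterWorld's actual code; assumes Python with the third-party BeautifulSoup/bs4 library, and a hypothetical forum hostname):

```python
# Sketch: add rel="nofollow" to external links in user-posted HTML,
# leaving internal links untouched. The hostname is a placeholder.
from urllib.parse import urlparse
from bs4 import BeautifulSoup

OUR_HOST = "www.example-forum.com"  # hypothetical forum hostname

def nofollow_external_links(post_html: str) -> str:
    soup = BeautifulSoup(post_html, "html.parser")
    for a in soup.find_all("a", href=True):
        host = urlparse(a["href"]).netloc
        if host and host != OUR_HOST:   # relative links count as internal
            a["rel"] = "nofollow"
    return str(soup)

print(nofollow_external_links(
    '<p>Try <a href="http://www.example.com/">this site</a>.</p>'
))
# -> <p>Try <a href="http://www.example.com/" rel="nofollow">this site</a>.</p>
```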

nuthin

2:06 am on Nov 29, 2006 (gmt 0)

10+ Year Member Top Contributors Of The Month



We have seen this type of penalty or filtering happen on a few sites we own before. What usually happens is a time-delay penalty, and they come back in about 3 months' time.

That said, most of these never do as well as they did before the penalty took place.

AustrianOak

5:08 am on Nov 29, 2006 (gmt 0)

10+ Year Member



nuthin, going on 8 months with no expiry here. Some are near a year.

What changes did you make, since they do not expire automatically?

nuthin

5:46 am on Nov 29, 2006 (gmt 0)

10+ Year Member Top Contributors Of The Month



Surprisingly, we made no changes at all to the sites that were getting filtered under some of the keywords the pages targeted. Which leads me to believe that they are time-stamped by Googlebot. ;)

I have a feeling they came back when a major update happened at Google. These happen from time to time, so 8 months for you might be OK... but only if Google hasn't done a major update.

I have noticed that a few competitors' sites that were getting filtered out are now back this week. This coincides with the updating Google has done this week -- which I personally see as a major one, with a release of the sandbox and a release of a few sites that had been getting filtered out under certain keywords.

AustrianOak

5:51 am on Nov 29, 2006 (gmt 0)

10+ Year Member



nuthin, can you confirm that you were under the "minus 30" penalty? (where domain and all keywords rank in spot #31 or beyond)

Not just a simple shifting of rankings due to google flux.

ALSO, did you happen to check whether the competitors that are shifting were under the minus thirty penalty too?

[edited by: AustrianOak at 5:52 am (utc) on Nov. 29, 2006]

nuthin

5:56 am on Nov 29, 2006 (gmt 0)

10+ Year Member Top Contributors Of The Month



Definitely under that penalty and/or filter with my personal local business directory that I run. This happened today.

I have been through this, though, with at least 3 other company web sites in the past and have noticed that they all come back after almost the same length of time. Which has led me to believe they slap on a time-delay penalty depending on what they find.

In almost all cases when they do come back, they don't do "as good" with new content/pages. So yes, I would be worried if some of you have been affected by this.

Gimp

6:53 am on Nov 29, 2006 (gmt 0)

10+ Year Member



Nippi,

You appear to be a one-man band.

I am two levels away from the technical staff. And in many cases, technical work has been outsourced since 1995.

I leave it to experts like you to do what is right. They obviously did not do the job.

And I leave it to others to supervise. And they obviously did not do the job.

I am into this now only because I see a problem that requires my attention to get solved.

A factual situation as to what happened under specific conditions was presented so that people could have data.

Why a link was made and how it was made has no bearing on the theme of this thread. It would not change the results. So I recommend that you not encourage people to waste time asking why links were made.

In the past year or so what was ok is no longer ok. Google changed. So things must change.

I gave a checklist of what I did to try to help those struggling.

The -30 penalty does not ask why. It sees what is and acts. Fix "it" or get penalized.

[edited by: tedster at 7:19 am (utc) on Nov. 29, 2006]

Nick0r

8:43 am on Nov 29, 2006 (gmt 0)

10+ Year Member



That's one good approach -- and there are others. Live moderation, nofollow attributes, and scripted redirects are all possible approaches. Leaving a forum (or blog comments) open to unmonitored linking is not a good idea if you want search engine traffic. You've noticed how closely we police links here, right?
Yea, interesting. The only problem I see with the rel=nofollow is that there would be a lack of outbound links on a global basis.

caryl

2:36 pm on Nov 29, 2006 (gmt 0)

10+ Year Member



I agree, Nick0r!

Think about what is being proposed here...

NOW, when solid linking is of vital importance, not only for PR BUT just to stay out of "supplemental" HELL, it is being suggested that we "secretly" screw our link partners by adding code to make our outbound links impotent.

I can see it now... countless new threads about sites disappearing from the SERPs, pages disappearing from the index, mysterious ponderings as to why one's pages have gone from PR5 to PR2 when nothing else "seems" to have changed.

Maybe a more rational discussion would be in order, rather than this "sky is falling", fear-based, knee-jerk reaction to yet another one of God Google's inept attempts to rein in the internet.

...Just MHO
Caryl

Alex70

3:09 pm on Nov 29, 2006 (gmt 0)

10+ Year Member



IMHO
in the past years there was an abuse of reciprocal/non-reciprocal/triangular/etc. link exchanges. Now Google might discourage such behaviour by filtering some websites that participate massively in the practice. Personally, I think this is an obvious and reasonable step toward better SERP quality.

caryl

3:16 pm on Nov 29, 2006 (gmt 0)

10+ Year Member



Now on to what I really wanted to add to this thread...

I have seen it mentioned several times previously here, that this is a "rare" penalty. It may OR may NOT in fact be.

I for one have not even bothered to read this thread, until yesterday, because the title led me to believe it did not apply to me.

BUT - frustration and desperation finally got me to take a look.

I now believe that I have also been a "victim" of this "penalty" since April 26th as well!

One thing that made it hard for me to identify with is that my affected site did not target specific keywords. So there was no way to identify going from a certain position in a keyword search to position #31.

I lost 90% of my traffic on April 26th and have been making a plethora of changes/fixes in an attempt to, once again, fall into the good graces of the Google Gods. But, alas, to no avail.

Yesterday, while poring over this thread, I decided that I could at least do a search for my.website.com to see how my site would fare, and lo and behold, my site fell anywhere from position #23 to #26 across the datacenters.

Now, I could be wrong, but I think I have enough of the "symptoms" to qualify for this dreaded "plague".

So, it has only taken me 7 months to figure this out. I am certain that there are others who may be too shy to post, many who cannot identify the symptoms, and many who have simply given up at this point.

I am really not a prolific poster, but I want to thank all of you who do, from the bottom of my heart, for all of the insights and direction you have shared over these many months!

Caryl

caryl

3:31 pm on Nov 29, 2006 (gmt 0)

10+ Year Member



Alex70,

I do not totally disagree. What I disagree with is the METHOD.

If you choose to back out of certain previous link arrangements, then do it above board and REMOVE the GD LINKS! That way, at least your previous linking partners will be able to tell that you chose to no longer participate in a previous arrangement.

They, in turn, will eventually be able to see that they need to pursue other avenues for links.

To simply add "nofollow" tags to the code of your pages is both underhanded and undermining to your fellow webmasters, who have previously traded links, IN GOOD FAITH, with you.

Caryl

dangerman

3:35 pm on Nov 29, 2006 (gmt 0)

10+ Year Member



Caryl

That is the same day that I and many others got thwacked. There is no doubt I have the -31 penalty. And it is more evidence that this was not a manually applied penalty, not that I ever supposed it was.
Google lowered the bar that day, and many sites had the slide-rule applied to their transgressions. No matter how many hail marys you have made since April 26th (or site clean-ups), it seems that you have got to be an angel to get back into the G temple.

I don't have hard evidence, but I suspect that once you are out, you are out until they take an individual review. I wish there were more success stories for us to learn from, but they are rare birds indeed. I know others have mentioned Gimp, but one swallow makes not a spring.

theBear

3:37 pm on Nov 29, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Nick0r,

Forums are a major PITA to operate. The rel="nofollow" is but one way to quell the link spam that exists all over the web.

Even regular links are a PITA as they must be audited on a regular basis to prevent linking to bad places.

I think a number of folks in this thread have discovered some of that the hard way.

There are others who say, "Nope, that's not my problem." Well, have they checked with their eyes open or closed?

The thing about this line of work is that your mistakes with anything dealing with your site(s) can and will eventually bite you where you don't want to be bit, namely the wallet.

TravelMan

4:13 pm on Nov 29, 2006 (gmt 0)

10+ Year Member



Whilst these dupe issues are undoubtedly a factor of sorts, I also think that we would be foolish to discount the possibility that the eval team data may be being factored in somehow.

Assuming that they have a lot of people evaluating competitive SERPs (I and others believe they do), a search engine could, for example, set a bar of say 10 negative marks from individual reviewers, and then Kaboom, you are out, or marked down considerably in the next data refresh.

The data from the evaluators would then be folded into the algo during the next data push, giving your site its true position minus 30, for example.

IOW, if an arbitrary number of people all agree that a domain falls below a given set of quality criteria and they all mark it so, then no matter what the site owner does to it subsequently, it becomes stuck in quicksand. It's never re-evaluated either, because it no longer ranks for anything remotely competitive, so the evaluators never get to see it again. It might as well be a one-strike-and-you're-history policy.

It's pretty slick really. None of the 10 or 20 people reviewing the domain know each other, they are all bound to secrecy, and they all work in the confines of their homes, so on balance they can all be relied upon to make a fair and objective assessment.

I could be wrong of course. Maybe the reviewers do engage in periodic reviews of their axeman work, or maybe the reviewers anonymously and randomly review each other via a handful of samples, but I tend to veer towards the "that is unlikely to happen" position. Their workload would be too high, for one. And who could really argue against the opinion that if the same people gave similar marks to similar sites, their scoring could be viewed as fairly reliable overall?

Psychology is a marvellous tool too, and I'm sure the assumption of periodic supervisory reviews in the employee manual/contract would be a sufficient stick to keep reviewers in line and on message.

As for keeping site owners informed - well, I think it's common courtesy, but I guess they are quite happy to leave people chasing their tails. After all, if your aim is to reduce or eliminate incidences of spam/unwelcome content in your index, then why inform someone they are toast? Wouldn't this just encourage them to give up what they'd had incinerated and start afresh?

Maybe we can identify a common denominator here. I wonder if everybody who has been hit by this has a high incidence of eval.google.com referer strings in their logfiles...
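
A quick sketch of that log check (assuming Python and an Apache "combined" format access log, where the referer is the second-to-last quoted field; the filename is hypothetical):

```python
# Sketch: scan an access log and print every request whose referer
# contains eval.google.com. Assumes Apache combined log format, which
# ends each line with "referer" "user-agent".
import re

LOGFILE = "access.log"  # placeholder path
referer_re = re.compile(r'"([^"]*)" "[^"]*"$')  # referer, then user-agent

hits = 0
with open(LOGFILE) as fh:
    for line in fh:
        m = referer_re.search(line.rstrip())
        if m and "eval.google.com" in m.group(1):
            hits += 1
            print(line.rstrip())
print(f"{hits} request(s) with an eval.google.com referer")
```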

Alex70

4:14 pm on Nov 29, 2006 (gmt 0)

10+ Year Member



caryl

I'm not questioning the METHOD, which could be wrong indeed. I'm honestly trying to figure out what this filter might be. I do believe we should have a closer look at our IBLs and OBLs, because a bad practice is easy to spot, and it's obviously oriented toward manipulating the search results - that's it.

caryl

4:32 pm on Nov 29, 2006 (gmt 0)

10+ Year Member



I know others have mentioned Gimp, but one swallow makes not a spring.

I am not altogether certain that there is even a way to be sure this was indeed a "swallow" - if you get my gist.

Over the years I have had other sites languish for months under one Google blunder or another, only to resurface again once "issues" (mine or Google's) were resolved.

This "phenomenon" has earmarks all its own.

ONE, most notable, is the duration accompanied by little to NO change. Stability has not been a strong suit of Google's for the better part of two years now. BUT Google certainly has been steadfast in its treatment of my "-30 site".

I am virtually religious when it comes to documenting DAILY the traffic (or lack thereof), # of indexed pages, etc. It is almost as if they (Google) have set a daily limit on the traffic they will send me.

I do have some pages on the site that come up #1 for certain longtail searches. I have yet to figure out what makes those pages "special". They are NOT pages that have been modified in any way since April (i.e. Title/Description change).

This has truly had me baffled! The site went from hundreds (> 1,000) of Google hits per day to a trickle of ~30 to 40 hits per day (never any more than that). Frankly, I am perplexed as to why I get any hits from Google at all at this point.

Caryl

caryl

4:47 pm on Nov 29, 2006 (gmt 0)

10+ Year Member



TravelMan,

Maybe we can identify a common denominator here. I wonder if everybody who has been hit by this has a high incidence of eval.google.com referer strings in their logfiles...

As I have previously stated: YES, I monitor referer strings DAILY.

I have NEVER seen "eval.google.com" in my logs for any of the 15 sites I record.

Caryl

PS - I have been doing this DAILY monitoring since early January this year.

dangerman

5:02 pm on Nov 29, 2006 (gmt 0)

10+ Year Member



I get your gist on that. If indeed it has a nil survivor rate, I reiterate Walkman's earlier suggestion that it may be better to start afresh, however unpalatable that might be. The downside is the huge amount of site rebuilding work involved, and of course it may be nigh impossible to replace the quality inward links. I have started along this road, but at the same time I am not giving up on Site 1.

Your case is interesting in that you suffer *most* of the same symptoms, except that the -31 penalty is not ubiquitous over all your keywords. A different strain, but probably still bird flu! Sorry to labour the analogy.

Nofollow tags are a red herring - this is not, IMO, a factor. TravelMan: that's not an unrealistic theory, and you make some good points. I am going off to look for eval.google.com referer strings...

Martin40

6:23 pm on Nov 29, 2006 (gmt 0)

10+ Year Member



They know their algos... So how do we win here?

Good content and white hat SEO?

How did links to bad neighborhoods get on your site, and how did they get passed as worthy to be on your site, if you stood to gain nothing from a link exchange and they weren't affiliate links...

You just decided to whack up a whole heap of damaging links on your site for no reason?


A newbie SEO may not know about bad neighborhoods, plus... I added mine in 2004, when Google didn't seem to feel that strongly about this issue yet. That's why you have to stay informed on SEO trends. What's acceptable now may not be acceptable in 3 months.
And perhaps Google places higher demands on a site as its PageRank grows. What may be allowed on a PR5 site may not be appreciated on a PR6 site. If that's the case, then the quality of your links must grow with your PR.
Judging by recent updates, PageRank seems to be Google's main anti-spam tool, but its effectiveness is reduced when high-PR sites start to link to bad neighborhoods.
Simply put, Google wants to impose its methods on webmasters by expecting editorial responsibility from them. In itself this is quite acceptable, not because Google is a business and businesses have no responsibility, but because no web user wants to see a spammed index. However, Google is a bit schizophrenic about what to say to webmasters and what not.
All the info you need can be found on MC's blog, but on the other hand, webmasters are stonewalled during major SERPs upsets. I have to assume it's not Google's policy to alienate webmasters; rather, it's their lack of a coherent public relations strategy that is making SEO unnecessarily difficult.

[edited by: Martin40 at 6:50 pm (utc) on Nov. 29, 2006]

avalanche101

7:13 pm on Nov 29, 2006 (gmt 0)

10+ Year Member



Hi all,
it is not true that this penalty is permanent.
I have seen a site go from -31 in a search for its domain and in a search for all its keywords back to #1 for its domain name, and it is now climbing back up in searches for its keywords.

Further, I think this has a lot to do with bad neighbourhoods.
Check all the sites you have exchanged links with: do a search for each one's URL and see if it is coming up at -31.

[edited by: tedster at 8:42 pm (utc) on Nov. 29, 2006]

dangerman

8:52 pm on Nov 29, 2006 (gmt 0)

10+ Year Member



Well, that is good news then. Maybe there is hope.

Further, I think this has a lot to do with bad neighbourhoods.
Check all the sites you have exchanged links with: do a search for each one's URL and see if it is coming up at -31.

But not in this direction for me. I exchanged maybe 5 or 6 links in total; these are all OK. However, many dodgy sites have linked to me over the last year, including a complete site rip-off that meant filing a DMCA complaint.

avalanche101

9:03 pm on Nov 29, 2006 (gmt 0)

10+ Year Member



Dangerman,
do a search for their domains - are any of them coming up at less than #1 for such a search?
Who are they linking to?
Duplicate content issues?
Overuse of H tags, etc.?
Has anyone scraped your site? Do a Copyscape check.
This 183-message thread spans 7 pages.