
Forum Moderators: Robert Charlton & andy langton & goodroi


The "Minus Thirty" Penalty?

#1 yesterday and #31 today

2:36 am on Oct 13, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Mar 17, 2006
votes: 0

Hello everyone,

My site now ranks #31 for a search on its own domain name, and a bunch of keywords/phrases I usually watch were bumped from #1 to precisely #31. The ones that were #2 through #10 are now sort of all over the map, but generally within the first 60 results.

Does anyone have experience with this? What does the respected audience here think is the most likely reason for such a penalty? What do you suggest as the best strategy to fix it?

There hasn't been any major redesign recently, just routine adding of pages here and there. Some unique, some syndicated industry-related content.

Thanks for any ideas or comments!


6:50 pm on Oct 30, 2006 (gmt 0)

New User

10+ Year Member

joined:Sept 17, 2005
votes: 0

I know where my company's site falls short... and if I were one of the Google people handing out -30s to sites for keywords in my sector...

I would have given the penalty to MY COMPANY'S SITE, just like some guy at Google did... I am trying to be real here, thinking like a general user.

But where my company's site falls short CANNOT be determined by any SERP or algo; only humans can judge that...

Like I said, they may find that keyword on a page with proper content, but they will not find what they are actually looking for... like real products to buy, or images to download, etc.

So all in all this is a MANUAL BAN... and a manual ban is unlikely to have any expiry date, unless the person who banned the site monitors those sites for a while to give them another chance...

The best you can do is correct all the mistakes, improve where the site falls short, and then file a re-inclusion request. Sending a re-inclusion request without finding and correcting the cause will be of no use.

We will be improving and filing a re-inclusion request :)

But I'll say it once more... this is a manual process (ban)... so next time you optimize your site or write content, make it for people, not for SERPs... just what Google keeps on saying...


7:27 pm on Oct 30, 2006 (gmt 0)

Full Member

10+ Year Member

joined:Jan 1, 2003
votes: 0

Good point guessme,

but that is the point for some of us. I've always created content for users, not Google. I only started to learn about SEO once my site was penalized. I'm not tooting my own horn or exaggerating... just being honest about this crisis.

I agree with you: those that deserve to be punished should be. But keeping it real, it's a slap in the face otherwise.

[edited by: AustrianOak at 7:58 pm (utc) on Oct. 30, 2006]

8:04 pm on Oct 30, 2006 (gmt 0)

New User

10+ Year Member

joined:Nov 10, 2005
votes: 0

My experience with the plus 30 penalty

I work for a PR 7 website that operates in a very competitive area. We have created a large number of pages for all of the categories and sub-categories that we operate in. Some of these pages have unique content, and some have duplicate content repeated across a lot of the other pages - the only parts that differ are the title, meta description, and heading tag.

I have found that all of the sub-category pages with duplicate content sit at #31 in Google, whereas the ones with unique content or lots of deep links are at #1-5. All of the category pages with duplicate content, which have higher PR and are more closely connected to the PR 7 home page, are ranked #21 (a plus-20 penalty). The unique category pages with inbound links and good content rank well.

I have recently made some changes to make all of the sub-category pages unique, and I'm hoping this will result in their rankings improving.
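Not part of the original post, but a quick way to flag this kind of near-duplicate body text on your own site is to compare page bodies pairwise. A minimal sketch using Python's standard-library difflib; the page URLs, bodies, and the 0.85 threshold are all made up for illustration:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Return a 0..1 similarity ratio between two page bodies."""
    return SequenceMatcher(None, a, b).ratio()

# Hypothetical sub-category page bodies: same boilerplate, different lead word.
pages = {
    "/widgets/red":  "Red Widgets. Buy widgets online. Free shipping on all widget orders.",
    "/widgets/blue": "Blue Widgets. Buy widgets online. Free shipping on all widget orders.",
    "/widgets/faq":  "Frequently asked questions about widget sizing, materials and care.",
}

THRESHOLD = 0.85  # arbitrary cut-off for "near duplicate"

urls = sorted(pages)
for i, u in enumerate(urls):
    for v in urls[i + 1:]:
        score = similarity(pages[u], pages[v])
        if score >= THRESHOLD:
            print(f"near-duplicate: {u} vs {v} ({score:.2f})")
```

Here only the red/blue pair gets flagged, since their bodies differ by one word while the FAQ page is genuinely different text.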

From my observations I can say a couple things with confidence:
1. The penalty is not site or domain wide, it applies to specific pages.
2. I believe it applies to pages, not keywords. My pages held at #31 across a whole index of queries that would have brought them up.
3. Lots of inbound links or other characteristics of a quality page will allow the page to not be filtered by the penalty.

My changes will be indexed soon, and if the rankings move I will share my findings.


8:14 pm on Oct 30, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Mar 6, 2002
votes: 22

Maybe there are just some amateurs reviewing sites who don't know exactly what they're doing?

8:33 pm on Oct 30, 2006 (gmt 0)

New User

10+ Year Member

joined:Nov 10, 2005
votes: 0

I just wanted to say that I do not believe it is a manual ban.

1. The eval.google.com thing is just for evaluating algorithms. The engineers make new algorithms, and then run them in parallel with the current system and use contract workers to judge whether the changes they have made have improved or worsened the results. In the end, an algorithm will still be evaluating your site.

2. Just because we cannot think of how something could possibly be judged programmatically to be spammy or of low quality, that does not mean it can't be.

Anyway, I'm looking forward to seeing whether my pages are reincluded once they are re-evaluated. I'll keep you posted.

8:47 pm on Oct 30, 2006 (gmt 0)

Junior Member

10+ Year Member

joined:June 16, 2006
votes: 0

Like I said, they may find that keyword on a page with proper content, but they will not find what they are actually looking for... like real products to buy, or images to download, etc.

I'm not sure if I'm reading that correctly, but...

I'm willing to bet that an algo can in fact easily identify shopping cart directions, and most certainly <img> tags. Something like that can certainly be accounted for. Unless you're talking about the ability to identify whose picture it is.

Also, if there is in fact a -30 penalty that is manually applied, it could be something as simple as this:

Writing a script to list the top 15 sites for a previously specified selection of search terms (could be generated via another program, or by hand). Then remove all sites that fall within whatever parameters you choose (could be shopping-cart based, or whatever; pick your poison). The ones that are left are used to fix the natural search... just in time for the shopping season, I might add.
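As a rough sketch of the kind of script described above (the SERP data, URLs, and the shop-or-not criterion are all hypothetical; actually fetching and classifying live results is left out):

```python
# Hypothetical snapshot of the top results for one query:
# (rank, url, looks_like_shop). In reality you'd fetch and classify these.
serp = [
    (1, "example-shop.com",  True),
    (2, "example-info.org",  False),
    (3, "another-store.net", True),
    (4, "widget-howto.org",  False),
]

def filter_serp(results, keep_shops):
    """Keep only results matching the chosen parameter (here: shop or not)."""
    return [(rank, url) for rank, url, is_shop in results if is_shop == keep_shops]

print(filter_serp(serp, keep_shops=False))
# → [(2, 'example-info.org'), (4, 'widget-howto.org')]
```

The "pick your poison" part of the post maps to the `keep_shops` parameter; any other per-result attribute could be filtered the same way.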

OR - it could be someone stating their claim, like I did with Google, on the fact that some people are ranking for terms where the user is clearly NOT looking for anything related to their site.

However, someone did say that they have a -30 on an entire directory... which would totally suck. Go back and see if your site ranks for any natural terms, especially ones like "Chicken information" or whatever. Just see if you can get in the top ten for at least something.

If not, then it's worse than speculated. More like a ban from the top ten than an adjustment.

Other things might come into play: scraped content (a real big no-no recently), meta issues, linking profile, etc. Hundreds of things, really. I'd take a real long, hard look at your SEO practices before blaming Google for wronging you.

[edited by: tedster at 9:16 pm (utc) on Oct. 30, 2006]

8:35 am on Oct 31, 2006 (gmt 0)

New User

10+ Year Member

joined:Oct 27, 2006
votes: 0

My site has the penalty for all directories, pages... keywords.

But I saw one thing:
when the total results for a keyword are < 30, the position is the right one, with no penalty.

I tested it like this:

I entered a keyword from my site for a specific language, and the total results = 24. I was at #4.
I changed the language so the total results were > 30, and my site was at #31.
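That observation can be stated as a tiny model. This is purely descriptive, inferred from the reports in this thread, not anything Google has confirmed, and the function name and 30-result cut-off are assumptions:

```python
def observed_rank(true_rank, total_results, penalized=True):
    """Model of the reported behaviour: a penalized page shows at its true
    position when the query returns fewer than ~30 results (there is no
    position 31 to pin it to), and gets pinned to #31 otherwise."""
    if not penalized or total_results < 30:
        return true_rank
    return 31

print(observed_rank(4, 24))   # → 4   (24 total results: no demotion visible)
print(observed_rank(4, 500))  # → 31  (pinned into the "31 club")
```

This matches the poster's two tests: #4 with 24 results, #31 once the result set grew past 30.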

8:42 am on Oct 31, 2006 (gmt 0)

New User

10+ Year Member

joined:Aug 10, 2005
votes: 0

Hey people, my site has just come out of the minus 30 penalty. I had been using Link Vault networks but never got penalized until now. I have a unique-content website with 80,000 pages indexed. I had built about 20-25 affiliate link pages and removed them, as someone suggested in this thread. I removed all the Link Vault and DP co-op code and filed a re-inclusion request. Now, within less than a week, my site is back up to the same position it held 2 months ago... I guess it was related to excessive anchor text, and now, by removing Link Vault, it's all gone and back to normal.

I hope everyone else's sites come back up. But here are the tricks:
- Excessive anchor text (using the same anchor text about 1,000 times).
- Even a few affiliate pages going to those CJ or LS links. Remove them!
- File a re-inclusion request through Sitemaps.

If you're lucky, you'll be out pretty soon...
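If you want to check the "same anchor text about 1,000 times" symptom against your own backlink export, a minimal sketch follows. The anchor list and the 50% threshold are invented for illustration; real data would come from whatever backlink report you have:

```python
from collections import Counter

# Hypothetical anchor texts pulled from a backlink report.
anchors = ["blue widgets"] * 950 + ["widgets", "example.com", "click here"] * 10

counts = Counter(anchors)
total = len(anchors)
for text, n in counts.most_common(3):
    share = n / total
    if share > 0.5:  # arbitrary threshold for "suspiciously uniform"
        print(f"{text!r}: {n} links ({share:.0%} of profile)")
```

With this made-up data, only "blue widgets" trips the threshold, which is exactly the over-optimized profile the post describes.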

[edited by: tedster at 3:19 pm (utc) on Oct. 31, 2006]

1:07 am on Nov 1, 2006 (gmt 0)

Senior Member

joined:Dec 29, 2003
votes: 0

Can you please elaborate on the anchor text a bit more? "Excessive anchor text (using same anchor text) about 1000 times" - do you mean other sites linked to you with the same anchor text?

>> Even a few affiliate pages that go to those CJ, LS links. Remove them!

That is not good advice, no offense. I know many sites that make a living off affiliate links, and removing them would be suicidal. They are doing very well in Google too.

1:11 am on Nov 1, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
votes: 0

Agreed, walkman. It's not affiliate links on their own that are the problem. It's a lack of unique content, or of added value for the visitor, that creates the "thin affiliate". And Google doesn't want to offer a first page full of Tweedle Dee and Tweedle Dum.

6:46 pm on Nov 2, 2006 (gmt 0)

Junior Member

10+ Year Member

joined:Mar 22, 2006
votes: 0

This has got to be one of the weirdest things I've seen Google pull in years.
Yes, today, as of 11 am UK time, our site went from first-page results in our sector to exactly position 31. Precisely 31. What does it prove? Of what benefit is it?
To the 2 banal questions posed by Adam the answer is a big YES, and it has been a big YES for 5 years.
How long has this mysterious dumping to position 31 been going on?
Who else is in the 31 club?
Is there really any rhyme or reason to it, or are we dealing with the likes and dislikes of some muppet at the 'plex?
It's not very professional, really.

9:03 pm on Nov 2, 2006 (gmt 0)

Junior Member

10+ Year Member

joined:Mar 22, 2006
votes: 0

Just a thought,
but I've just checked our Google Sitemaps diagnostics and it's reporting 80-odd pages not found. A colleague uploaded a load of pages over the last couple of days, and after checking them there are loads of typos in links to other pages on the site, hence Google reporting 80-odd pages not found.

Is anybody else seeing this?
If so, could this be the cause of the mysterious minus-30 phenomenon?

10:21 pm on Nov 2, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Jan 3, 2003
votes: 0

Excessive anchor text, to me, is only likely to be a problem when it's on your own site:

new york hotels
new york luxury hotels
new york luxury accommodation

etc., x 1000 on one page. Even so, I am not sure it's really a problem; I've not been able to prove it one way or another.

12:13 am on Nov 3, 2006 (gmt 0)

Preferred Member

10+ Year Member

joined:Aug 17, 2005
votes: 0

Maybe it's this, maybe it's that.

It's strange really, with people ready to steal your content, ideas, and domains. To try to fool you, scam you and spam you. To set lawyers upon you.
Yet there's only one company with the absolute power to really destroy you.

Don't build websites for search engines, they say. Yet what do you think most of the people who've been hit by the various penalties are now having to do?

Are they making sure customers are being looked after? Or frantically trying to guess why they no longer rank? Without rankings, soon you will have no customers.

So every day more and more sites disappear; it doesn't matter, there are new ones ready to fill the gap. And you can always pay your way back in through AdWords, making this company ever more powerful. Made more in ads than the UK's Channel 4, you know (they're grrrreat).

So you're left constantly trying to guess what may be wrong and what may be right. With no real help apart from vague statements: 'do no evil', 'don't over-optimise'. Check our webmaster guidelines. And read our vague answers.

Join our webmaster site, get pointless pretty pictures about PageRank and when our little spider visited you. Use site: and link: and try to guess what's wrong, yet all the time the simple answer is whatever tripped the supposed (automatic) filter.

What is the problem? What have we done? Oh, you can't be told that... no, far too easy. Then the nasty scammers and spammers would know what not to do. Yes, them. Curse the evil ones making saint G look bad. Curse them and their AdSense-filled sites. Eh? But...?

Is it affiliate links? Quick, get rid of them. I heard someone say G doesn't like them any more. But they're there for the customers. Doesn't matter; soon we will have no customers. Put AdSense up instead. But...?

Search your logs, search your content, keywords, meta tags, links, sites that link to you, sites that don't link to you. What are other sites doing? Why the hell is that other site even listed? Why aren't we listed?

File a re-inclusion request form and tick the box to admit to breaking unmarked, unknown guidelines, even though you have no idea what could be wrong. You fixed it, right? You know, that unknown thing.

So how many small business people can do all this? How many are going to step over the unmarked line and be trashed?

Who cares about the small fry anyway; they're not going to fill our pockets.
We use your pictures, we use your content, we track your visitors, we place ads on your site, we use your videos, we steal your books and newspapers, we hold your emails, we show detailed satellite images of your home, garden, office, business. If you put it on the net, it's ours.
What ya gonna do? We own the net.

Apparently they have lava lamps, you know.

wooo hooo, 100 pointless posts ;)

< continued here: [webmasterworld.com...] >

[edited by: tedster at 1:55 am (utc) on Nov. 14, 2006]

This 194-message thread spans 7 pages.