
The "Minus Thirty" Penalty - part 2

#1 yesterday and #31 today



1:49 am on Nov 3, 2006 (gmt 0)

5+ Year Member

< continued from [webmasterworld.com...] >

Hello All.

Some time after my site was affected by the -30 penalty, and after reading the latest posts left after my last message, I did some research on my own site and on the site of a competitor that was affected by this penalty on the same day as mine.
I'll try to summarize all of our latest thoughts on this topic and real data from our sites and the SERPs.


Even a few affiliate pages that go to those CJ, LS links. Remove them!

As I said earlier, my site is a 5-year old resource directory and consists of 5 pages.
Top ranked was always ONLY Index page. No other page was shown in top SERPs ever, and
this Index page was penalized.

The Index page has 84 links to external sites, 4 links to internal pages, and 9 links to affiliate sites.
The second content page has 161 links to external sites, 4 links to internal pages, and 0 affiliate links.
The first Information page has links to 30 product pictures, 3 links to internal pages, and 11 affiliate links.
The second Info page, before the penalty, had links to the same 30 product pictures as the first page, 2 links to internal pages, and 8 affiliate links. (To avoid a duplicate-content issue, yesterday I replaced these 30 links with links to other, different product pics.)

Each and every link from our site, regular or affiliate is highly relevant to our site's subject and SE keywords, with no exceptions.

And after looking over all this data, I see only two potentially thin affiliate pages on my site:
the First and Second info pages. But honestly, I'm not sure whether these pages can be classified as thin affiliate pages or not.

Okay, perhaps we found one potential reason for the -30 penalty, but IMHO it's not the real reason.


Excessive anchor text (using the same anchor text about 1000 times).

Exactly which anchor text? The anchor text on my site for outbound links, or the text of the links to our directory from other sites?
If outbound links: there are only 2, maximum 3 combinations for each SK phrase in the anchor text on the Index
page, and 8 repetitions of one of the main keywords for this phrase (97 links total on the page).
Is this excessive anchor text? I'm absolutely not sure.

About inbound links: I don't think they can be the reason. If they could be, then I could push my competitor's site down in the SERPs just by adding hundreds of links to his site from different pages on different domains. I don't think G can be tricked so easily.

So, my site is not overloaded with excessive anchor text, but it is still penalized, so perhaps that's not the main reason for this type of penalty.


Also, if there is in fact a -30 penalty that is manually applied, it could be something as
simple as: writing a script to list the top 15 sites for a previously specified selection of
search terms (could be generated via another program, or by hand). Then, remove all sites that fall
into #*$!x parameters (could be shopping-cart based, or whatever. Pick your poison). The ones that
are left are used to fix the natural search... just in time for the shopping season, I might add.
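For what it's worth, the quoted process could be scripted roughly like this. This is a minimal sketch for illustration only: the `serps` data, the example URLs, and the criteria function are all made up, and nothing here reflects how Google actually works.

```python
# Hedged sketch of the speculated manual-review script: take the top
# results for each search term and flag the sites that match some
# hand-picked criteria. All names and data below are hypothetical.

def flag_sites(serps, matches_criteria):
    """serps: {term: [urls in rank order]}; return the flagged URLs."""
    flagged = set()
    for term, urls in serps.items():
        for url in urls[:15]:          # "top 15 sites" per the post
            if matches_criteria(url):
                flagged.add(url)
    return flagged

# Illustrative use: flag anything that looks like a shopping cart.
serps = {"red cars": ["shop.example/cart", "museum.example/cars"]}
flagged = flag_sites(serps, lambda u: "cart" in u)
```

The point of the sketch is just that such a filter is mechanically trivial to write; the hard part, as the reply below argues, is choosing criteria that match what searchers actually want.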

Not so simple. There are too many parameters. One site is about literature, another is about car tuning,
a third is about the history of art.
How do you define what each surfer wants to find? If I'm searching for "antique literature", it doesn't mean
that I want to buy such books; maybe I'm looking for online texts, or the history of some books. If I'm
searching for "red cars", it doesn't mean that I want to buy them; perhaps I just want to find some
kind of online catalog of these cars, or want to read about the work process and how these cars were
built. How can G know what is inside my head? If they run such algos, they risk too much, and finally,
they cannot make it work, simply due to human nature. Of course, this is IMHO.


>>>>>>>>But the content is not exactly what user may want to see.... like page made xyz-pictures
has no pictures in it instead it has content which say xyz-pictures etc.

And finally, what I think is a very interesting idea.
As I said, our site is a DIRECTORY. It consists of descriptions of and links to other sites highly relevant to our narrow subject. In other words, we do not actually have "red cars" on our site, but we know exactly where they are, and a surfer can easily find them in our directory. Our directory is all about "cars": we have info on where to find "cars", and we have links to sites only about "cars", i.e. the site is highly relevant to this SK. BUT the site doesn't have "cars" on it, and perhaps this is the possible reason for the penalty.
What's more, my competitor's site, which was penalized on the same day as my own, is a directory site too, with the same subject and the same "problem". It has only links to the SKs, but not the SKs themselves.

BUT. As I can see from the current SERPs, there are enough directories on our subject left in the top SERPs, and they are not penalized for their nature. So, the truth is out there..... :)

[edited by: tedster at 1:54 am (utc) on Nov. 14, 2006]


8:00 am on Nov 3, 2006 (gmt 0)

10+ Year Member

lol appi2 - thanks for that! Made me smile :D

[edited by: TravelMan at 8:01 am (utc) on Nov. 3, 2006]


9:19 am on Nov 3, 2006 (gmt 0)

5+ Year Member

Just checked Sitemaps again and it's saying:
No pages from your site are currently included in Google's index. Indexing can take time. You may find it helpful to review our information for webmasters and webmaster guidelines.

However, I can still see our pages in Google; I checked all the data centres and they are still there.

Is this the next stage in going to 31?


10:38 pm on Nov 3, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

They often send out messages first before removing your pages, see removal tool.


11:59 am on Nov 4, 2006 (gmt 0)

5+ Year Member

Yesterday one of my sites also got the "minus thirty" penalty. I checked Google Sitemaps, and every site I listed there said
"No pages from your site are currently included in Google's index. Indexing can take time. You may find it helpful to review our information for webmasters and webmaster guidelines."

After two hours that message in Sitemaps was gone... but the penalty is still ON, and I haven't used any 'questionable' techniques.
What did change after this penalty is that if I do a site:website.com, only half of the site shows, so might it be that (a certain percentage of) duplicate content tripped this filter?


12:23 pm on Nov 4, 2006 (gmt 0)

5+ Year Member

Hi tinus75
Same here: first the message saying no pages are included in the index, then a few hours later:

"Pages from your site are included in Google's index. See Index stats"

All very odd, the site went to PR7 last month, it still is PR7 and all pages are in the index.

We get this in site maps as well:
HTTP errors (0)

Not found (68)

URLs not followed (0)

URLs restricted by robots.txt (2)

URLs timed out (0)

Unreachable URLs (0)

The "Not found" count has come down from 86, as we are cleaning up some pages that went up recently with really bad typos in the links, which would make the bot think there are pages there that clearly aren't.

Could it be a penalty that is triggered if you have too many URLs not found?
Could it make them think that you are up to something nefarious, when really it's down to human error?
Our site is 5 years old and has been doing well; we really do try not to "upset" Google.
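The cleanup described above can be automated. A minimal sketch: scan each page's HTML for internal hrefs and report the ones that point at pages that don't exist (e.g. typos in links). The `pages` dict and its contents here are hypothetical example data, not the poster's actual site.

```python
import re

def find_broken_internal_links(pages):
    """pages: {path: html}; return (source_page, bad_href) pairs."""
    broken = []
    for path, html in pages.items():
        # Naive href extraction; enough for a sketch, not a real parser.
        for href in re.findall(r'href="(/[^"]*)"', html):
            if href not in pages:
                broken.append((path, href))
    return broken

# Hypothetical site with one typo'd link ("/abuot.html").
pages = {
    "/index.html": '<a href="/about.html">About</a> <a href="/abuot.html">Oops</a>',
    "/about.html": '<a href="/index.html">Home</a>',
}
broken = find_broken_internal_links(pages)
# broken == [("/index.html", "/abuot.html")]
```

Running something like this before publishing would catch the bad typos before the bot ever sees them.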


12:58 pm on Nov 4, 2006 (gmt 0)

5+ Year Member

Oh, and we also come up at position 31 for a search of www.site.com.
Any other site that has our URL in a link or on a page comes up before ours!


1:18 pm on Nov 4, 2006 (gmt 0)

5+ Year Member

hi avalanche,

Yes, exactly the same, but I don't have any errors in Sitemaps,
so I don't think it's tripped by having too many URLs that can't be found.

I had a CSS scroller with 'scrollbar-face-color: #FFFFFF;' because it just blended better with the rest of the design. Changed that to #F4348A ;)

Personally I think it's something on-site. I have another site built with exactly the same structure, same IP, different content (but same branch), but with aggressive link-building, and it hasn't been affected.


2:03 pm on Nov 4, 2006 (gmt 0)

5+ Year Member

That's a shame; I thought I was onto something there. Still, we need to clean that up anyway.
Is that the only change you have made recently that you think could have caused this?

I don't think it's manual (I have no real evidence to say it's not manual, just more faith in an algo causing this mayhem), but I do think there is some part of the algo that is a bit weird.

Totally off topic, but I just did a product search for HBH 300 (it's a headset; mine's faulty). The results here in the UK are utterly weird. The first page seems OK; go to page 2 and you start to get a URL with no description and a link for similar pages below. I've never seen that before, and it goes on for page after page.


2:55 pm on Nov 4, 2006 (gmt 0)

10+ Year Member

My experience with the plus 30 penalty

National site with 50,000 indexed pages. It's been around for a while with a PR5 home page. It hit position 31 in the middle of June this year. Where I differ a little is that the home page still shows up number one for several very competitive terms. The home page also shows very well for major city names. However, the city pages show up exactly in position 31 across the board.

So my site has exactly what the rest of you guys have, except the home page is showing up fine.

By the way, a month or so ago Yahoo started indexing us, and now we have over 6,000 pages indexed. It has been a lifesaver.

We do have a page-generation system creating a page for every city and county in the US. Same content, but the public does seem to want to make sure you serve their area. That new system was started in April, and we got this position-31 filter two months later.
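A page-generation system of the kind described above can be sketched in a few lines; the template and city list below are hypothetical, not the poster's actual system. The sketch makes the duplicate-content risk concrete: when only the city name varies, any two generated pages are near-identical.

```python
# Hedged sketch of a city-page generator: one template, filled in
# with each city name. Everything here is illustrative example data.

TEMPLATE = "Find local services in {city}. We proudly serve the {city} area."

def generate_city_pages(cities):
    """Return {url_path: page_text} for every city in the list."""
    return {"/{0}.html".format(c.lower()): TEMPLATE.format(city=c)
            for c in cities}

pages = generate_city_pages(["Austin", "Boston"])
# Every page differs from every other only by the substituted name,
# which is one way such pages can read as duplicate content.
```

Adding some genuinely different geographic text per city, as the competitors mentioned below do, would break up that uniformity.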

Like others, I have started a test: putting original text on a page that has some PR, to see if that page starts to show better than position 31.

I do have a few pages on a slightly different subject that do show up, and the site-wide pages like the FAQ and How to Post pages also show up and are not affected.

In my case, I am thinking it is not enough original on-page text. Competitors have the same type of pages, but have some different geographic info about each city and area.

I don't think it is the meta tags but the on-page content. I think it is automatic, not manual. Just my thoughts.


11:11 pm on Nov 4, 2006 (gmt 0)

10+ Year Member

One of my sites has also been hit with the minus thirty penalty. It's a site-wide penalty. The site hasn't had any serious SEO.

After receiving the penalty, I did clean up some minor keyword-density issues on the home page and removed some very limited cross-linking among pages, basically some duplicated page navigation.

I don't think either of these problems were even close to being serious enough to warrant a penalty - the site is pretty light on content and has a good deal of affiliate links embedded in it - that's my best guess of what the problem is with this particular site.

I have other sites with a loosely similar setup that haven't received a penalty - I wish I knew a little more about what was causing this site to get whacked.

I added some additional functionality to the site and submitted a reinclusion request, though I doubt it will have any effect and, knowing Google, may actually hurt the site's rankings.

I've also started adding some additional content to the site, though that will take time. I'd like to think I can do enough to get this site's penalty lifted, but I'm not holding my breath. I may have to move the site, as I have another domain I could place it on; but then it may just get penalized again, and a good deal of my work would be wasted if that happened. So I'll keep adding content for the time being.

I can't say I have an anchor text issue as floated here earlier - I could only hope it was that easy.


4:06 pm on Nov 5, 2006 (gmt 0)

5+ Year Member

Just got to say that appi2's post is probably the best post about Google I've read in a long time. It really has turned Kafkaesque in the last couple of years. You're accused of something, but you're not allowed to know what the charge is; just polite hints of "but you must know what the problem is, right?"


10:49 pm on Nov 6, 2006 (gmt 0)

10+ Year Member

I think the limited or rather non existent amount of info coming out from Google on this suggests that they are aware that they may be treading on very unfirm legal ground with this type of penalty.


1:14 am on Nov 7, 2006 (gmt 0)

5+ Year Member

I think the limited or rather non existent amount of info coming out from Google on this suggests that they are aware that they may be treading on very unfirm legal ground with this type of penalty.

After analyzing all the limited info on this type of penalty, including some comments on the Google forum, it still looks like the "like/dislike" opinion of an unknown editor who can personally decide that there are more interesting sites on the web than yours, and because of this, you must go to position 31 just because the "editor" decided so.

Up till now I can't find any similar problems across the different sites affected by this penalty that could potentially be its cause. I can't find any system, and I can't see even pieces of this puzzle at the moment.


2:15 am on Nov 7, 2006 (gmt 0)

10+ Year Member

Put the tinfoil hats down, people.
Whether it's the algo, or a Googler, or a chimpanzee called Dave, doesn't matter.

Now for a bit of Google love! (sort of). Been watching <Google Groups for Webmaster Help in Crawling, indexing, and ranking> for the last couple of days. It's like some form of torture but without the orange jump suits.

You will see some (very few) sites where you can't see why they've been hit.

You will see many where they should be beat about the head with the aforesaid lava lamps.

Just try it: pretend you're a Googler and review each site that pops up. Don't worry, you won't see a real Googler within a hundred posts. Idle ...

Now, if you spent more than 4 hours doing that, you should be ready to classify all those with penalties as complete and utter?

Now, does Google have a problem? Only answer if you did the 4 hrs.

Let's just hope a different class of webmasters is posting here.

[edited by: tedster at 6:22 am (utc) on Nov. 7, 2006]


2:28 am on Nov 7, 2006 (gmt 0)

10+ Year Member

lol.. thanks for lending your foil hat!

Any threads in there that deal with this topic specifically (the -30 penalty)? Sure, there are a lot of people out there cheating Google, but we're trying to figure out how the -30 penalty is assigned. So far we've seen one thread, posted earlier, that didn't offer any specifics.

Anyone making any progress on the penalty? Updates?


[edited by: AustrianOak at 2:29 am (utc) on Nov. 7, 2006]


4:08 am on Nov 7, 2006 (gmt 0)

WebmasterWorld Senior Member whitey is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

It looks like we've joined the -31 club, but I wonder if the filters are applied as holding patterns pending QA [Trust] validations?

Then I wonder if these are released as each validation is passed, or held if they fail.

2 of our sites are indexing very slowly. When I put mysite.com into the search, they come up at No. 31. I would expect them to be at No. 1.

These big sites are awaiting full indexing following duplicate-content problems which were fixed around 8-10 weeks ago.

BTW, we have one site at -21; does this mean it's set to a lesser problematic level? Interesting. Then we have another site with the same re-indexing cycle, and this site's at -11.

These last two sites are more advanced with indexing.

Just some observations I thought worth sharing.


8:41 am on Nov 7, 2006 (gmt 0)

I am starting to believe that an editor's vote might have doomed us to the 31+ club. If you recall, they were instructed to rate sites that they thought were thin affiliates offering no added value; while they might have gotten most right, mistakes do happen.

I know GoogleGuy had said that their ranking was passive, but honestly, I don't know if I believe it. I am looking at my pages and they are very, very different from each other and from related sites, yet I see no progress, despite many crawls.

I have even asked, and I am waiting, for a site to remove an unsolicited sitewide link...I have to do something.


8:44 pm on Nov 7, 2006 (gmt 0)

10+ Year Member

Hello everyone,

I'm the person who has had the penalty removed. I've been busy lately, so I couldn't respond. In my opinion, since I had about 2000+ links pointing in from Link Vault, and 200-something from quality sites, I really didn't get banned or penalized. I think my domain was suffering from anchor-text over-optimization as well as from 'thin affiliate' pages. I thought that removing the Link Vault links to my domain would hurt me, and it did. I was pushed from position #5 to #8 just by removing all those links, but it's better than sticking at #31, right?

Now, in the reinclusion request I basically mentioned that I have been bad and specifically said that my domain's suffering from -30 penalty. I also said that I removed all the codes and affiliate links, and voila, I was out within 2-4 days.

Also, for a long time, about 14 months or so, I was using IP-based cloaking, and I updated my IP list almost every day (not agent-based -- that's sooo 90s) to hide the Link Vault and DP links from all SEs. Luckily, I have gotten away with cloaking so far. However, remember that these interns at Google have Firefox extensions installed where they use different proxy servers to view a domain when they review your site. They also view the source code, and if they see a discrepancy, you might get banned forever. So, be careful there.

Good luck, people. I have learnt that you have to be really honest when you file a reinclusion request.


10:24 pm on Nov 7, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

*What I do not understand is why Google feels it is necessary to penalize an entire site why not simply remove all the offending pages, and leave the rest alone?*

"There are many reasons why we focus on sites rather than pages in this context, including:
- It sends a more clear signal to Webmasters so that violations typically get fixed ASAP.
- It prevents Google users from seeing a domain rank highly, despite spammy practices on many pages.
- It makes it harder for Webspammers to experiment with thresholds ("Let me try keyword stuffing and nasty redirects on THESE pages... and keep these other pages as the control group and let's see how Google treats the individual pages...") "

Adam Lasnik


4:24 pm on Nov 8, 2006 (gmt 0)

5+ Year Member

does anyone know what causes the -30 penalty to be implemented?


4:52 pm on Nov 8, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

>So, does anyone know what causes the -30 penalty to be implemented?

Has anyone looked at someone else's site and not seen an obvious reason for the penalty?


5:08 pm on Nov 8, 2006 (gmt 0)

5+ Year Member

Well, it's been suggested to me: link purchasing and certain link-exchange programmes.
But that probably takes care of 70 per cent of the web.
Anything more specific?
It is a very different kettle of fish compared to outright bans etc.
I thought it deserved more analysis.


7:41 pm on Nov 8, 2006 (gmt 0)

5+ Year Member

I have done very, very few link exchanges (3-4) and purchased about the same number of links, so I really don't think that is the problem unless you go nuts doing this. If you do, caveat emptor.

Like I have stated before, my site has been nailed with the -30 SERP penalty for the past 11 months. My problems didn't surface until I hired an outside programming firm to upgrade my site. Heck, I was making good money and wanted to invest profits in my business, making it an even better user experience.

Within a month of them putting up a dev site, the -30 penalty appeared. The programming firm had not excluded Googlebot or put noindex, nofollow on the mirror development site. I didn't catch that until it was too late, but I fixed it immediately. Never again will I go outside for any development work.

Later I found that they had incorrectly done the error handling. When a hotel was no longer in the database, they first used a 302 redirect, causing literally thousands of 302s. I found that in March and told them to fix it. Well, their fix was to use a 301. I didn't catch this problem until October 19. I then fixed that and filed another reinclusion request. Since I thought any qualified programming firm knew how to properly handle errors, I had looked elsewhere in my research.
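For context on the error-handling point above: when a record has been permanently removed, the conventional response is 404 Not Found (or 410 Gone) rather than a 302 or 301 redirect, so crawlers drop the URL instead of following thousands of redirects. A minimal sketch of that approach, where `HOTELS` is a hypothetical stand-in for a real database lookup:

```python
# Hypothetical data; a real system would query its database here.
HOTELS = {"grand-plaza": "Grand Plaza Hotel"}

def handle_hotel_request(slug):
    """Return (http_status, body) for a hotel detail page."""
    name = HOTELS.get(slug)
    if name is None:
        # The listing is gone for good: say so, rather than
        # 302/301-redirecting the way the contractors did.
        return ("410 Gone", "This hotel listing has been removed.")
    return ("200 OK", "<h1>%s</h1>" % name)
```

The same shape applies in any web framework: look the record up, and answer with a terminal error status when it no longer exists.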

I really think this -30 penalty is manually applied. Therefore, it has to be removed manually. My problem with Google is the lack of communication with whitehat webmasters.

Anybody with a 6 month -30 penalty deserves communication from Google.


8:40 pm on Nov 8, 2006 (gmt 0)

5+ Year Member

Now that's just crazy.
You have been penalised unfairly. Did you mention all of the above in your request to G?
You've had no contact in 9 months - horrible.


9:16 pm on Nov 8, 2006 (gmt 0)

5+ Year Member

Yes I mentioned that in great detail to Google. I have filed at least 4 reinclusion requests since March. Sadly, I have not heard a word from Google since March 9 and that was regarding Big Daddy.

And all the visibility I have brought to this -30 SERP penalty has been almost totally ignored by Google. Adam Lasnik did respond to my thread at [webmasterworld.com...] but even he has not responded in several days.

Apparently, this is definitely a topic that Google does not want to address. I can't think of anything more unfair!


9:29 pm on Nov 8, 2006 (gmt 0)

5+ Year Member

Completely unfair.
Also, it could be a great opportunity for Google to reach out to webmasters and say: yes, it is a penalty; this or these are the reasons for it; and if you fit into that category or categories, get it fixed and file a re-inclusion request.
Surely it's in their interest not to punish those who have made simple errors; it makes for a better web and ultimately better search.
It's true that Google reaches out more than most search engines to the people who build content for the web, but it's appearing quite stifled at the moment.
They reached out over the 302 redirect problems that hit many sites.


9:50 pm on Nov 8, 2006 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member

They reached out over the 302 re-direct problems that hit many sites.
Umm... the 302 debacle was actually THEIR problem. It was and still is a valid HTTP response. People found malicious ways of using holes in Google's (and others', to be fair) algos, but that's entirely the search engine's problem. I would say this is why there was any communication back from them.

In this particular case of the -30 penalty, I don't hold my breath waiting for Google's response. All in all, this is not something that will cast a shadow on their public image (99.999% of the public would have no idea what you are talking about), and therefore no official response will ever be given, 'cause they don't have to unless it affects their shareholders' interests.

I guess, in general, Google accepts some degree of collateral damage. So, get over it. I have.


10:19 pm on Nov 8, 2006 (gmt 0)

5+ Year Member

1script, has your site been written off by a minus thirty penalty?

I think if they really didn't care, they wouldn't have reps posting on this forum.


11:19 pm on Nov 8, 2006 (gmt 0)

so far we have confirmed that links will do this:
"We will take action to level the playing field, however, when we see *significant patterns* of sites buying or selling links for the clear purpose of manipulating their ranking (rather than buying traffic). "

What if a political blog adds my 88-by-31 image and links to my totally unrelated site sitewide? Can we be assured that this penalty is manual (at least that a person looks at it) and that we can email to explain that we didn't buy it? This is scary, and I'm not even getting to the competitors spending $50 to buy a sitewide link for others on cheesy sites.

If Google spots suspicious activity, why not ignore those links instead of penalizing everything? The buyers will be punished, as they are shelling out $$ for something of next to no value. Soon enough the word will spread, and ratings are NOT being skewed, since those links are x-ed out by Google anyway. Most normal sites "cheat"; they have to in order to even get indexed, let alone rank in Google. Another site selling shoes will not make it into the NY Times or Digg, but they are nonetheless a business with maybe a different service, strategy, etc. Unless they get some high-PR links, months will pass before Googlebot even sniffs at them. I would argue that those who get sitewides are making it easier for Google, since they can be spotted and have that particular link's effect blocked right away.

My site has been hit for over a year, and it sucks, since I have to update daily anyway, without a good return on my investment. I have just started asking people to take down links again. This is ridiculous, especially when I see competitors with gazillions of obvious link exchanges flourishing.

This strikes as punishment not as a solution to a problem.
