Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

Google's 950 Penalty - Part 10

         

netmeg

8:26 am on May 28, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



< Continued from: [webmasterworld.com...] >
< related threads: -950 Quick Summary [webmasterworld.com] -- -950 Part One [webmasterworld.com] >

I have two urls that were 950'd on Friday. Unfortunately they are my two most requested urls (other than the home page) - they'd been #1 in the SERPs for at least three years, and now only show up on the last page for the most common search phrase. They both showed PR previously, but TBPR has been greyed out since the last update. However, in certain permutations of the search phrase, they still rank #1. The search string usually comprises the city name and the event, and often includes the year.

Example:
city event - 950'd
city event 2007 - 950'd
event city - #1
city state event - #1
city state event 2007 - #1

As far as I can tell, it is ONLY two urls, out of around 500, that fell into this (so far, anyway)

What it all means, I have no idea.

[edited by: tedster at 9:14 pm (utc) on Feb. 27, 2008]

Localizer

10:06 pm on Jul 15, 2007 (gmt 0)

10+ Year Member



About 2 weeks ago, my traffic tripled or quadrupled (nothing changed), but now my organic G traffic is completely dead. The whole site is still fully indexed, but at the end of the results...

[edited by: Localizer at 10:06 pm (utc) on July 15, 2007]

Biggus_D

3:57 am on Jul 16, 2007 (gmt 0)

10+ Year Member



Yeah, I see changes. Instead of one, I now have two sites in hell.

Englishuk

2:19 pm on Jul 20, 2007 (gmt 0)

10+ Year Member



Bump... Any updates? Has anyone come back out and stayed out?

Miamacs

3:11 pm on Jul 20, 2007 (gmt 0)

10+ Year Member



I can only repeat what I've said before. I'm using it in practice.

...

Sites on which I redesigned the nav anchors to be more reasonable and relevant, and/or got links with a variety of relevant phrases, and/or some additional variety in word order, and/or solved any other issues of dupe content, and/or 404s, and/or out of balance pagerank, and/or removed pages targeting market sensitive stuff that the site never had a single IBL for... are all out of the -950 area.

Problems can be different, solutions can be different, result is the same. Get the site in shape. Get some links.

All that said, I feel for those whose problems aren't that easy to spot.

MrStitch

3:40 pm on Jul 20, 2007 (gmt 0)

10+ Year Member



My target term has been Widget Blocks. Ranking is gone, but I still rank for Wooden Widget Blocks... #4 to be precise, plus a few other variations. Tells me that I'm being tanked for the specific term.

So, I've cleaned up some of the on-page SEO, removed double linking on a specific page (wasn't SEO-related, though, which makes me mad), added nofollow to links going to utility pages, changed my index.php back into an index.html (long story behind that one), and then issued a re-inclusion request a couple of days ago.

I'll just have to wait and see.....

bobsc

4:02 pm on Jul 20, 2007 (gmt 0)

10+ Year Member



Sites on which I redesigned the nav anchors to be more reasonable and relevant, and/or got links with a variety of relevant phrases, and/or some additional variety in word order, and/or solved any other issues of dupe content, and/or 404s, and/or out of balance pagerank, and/or removed pages targeting market sensitive stuff that the site never had a single IBL for...

Been there, done all of that. No change!
No messages in the "Message Center".
Question: What is Google trying to accomplish with the 950 penalty?

TaLu

4:26 pm on Jul 20, 2007 (gmt 0)

10+ Year Member



I think this is a manual penalization in some cases.

Miamacs

4:50 pm on Jul 20, 2007 (gmt 0)

10+ Year Member



Question: What is Google trying to accomplish with the 950 penalty?

It's a hack to address their own internal problem, quality vs. quantity. Their previous calculation of authority links allowed too much leeway. This is how they address it.

Sites that have the trust, but have certain other problems, are now not "pardoned" by default. Sites with only a couple of very good links that target major keywords and phrases are sent to the back until they gain some variety and a larger mass of votes from anchor text. Sites with faulty HTML, sites with spammy SEO practices... the list goes on.

Basically, trusted sites with problems.
The hard part is to spot those problems.
I've only listed the ones I had encountered in the past.
That list is not complete; no list could be, as there's an endless number of things that can trip the quality, accessibility and spam filters.

Technically speaking, this is a filter.
A script combs through the results, and sends sites to the back, but the ranking stays the same in the main database. Turn off the filters ( &filter=0 ) and see for yourself, sometimes the page is not -950.
In practice, however, this is a penalty. But it can be lifted by addressing the problems. With every recalculation of these triggers, pages pop in and out, either because the page, the overall dataset they compare it to, or the thresholds themselves have changed.

As far as it being manual, I haven't seen a single case in which I found even a hint of it not being completely automated.

edit reason: tired. I messed up the comment on the &filter=0 stuff

[edited by: Miamacs at 5:01 pm (utc) on July 20, 2007]
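The &filter=0 check described in the post above can be sketched as a small URL builder: fetch the same query with and without Google's result filtering and compare where your page lands. This is a minimal sketch assuming the 2007-era google.com/search URL format; only the q, num and filter parameters are used, and comparing the two result sets is left to the reader.

```python
# Build Google result URLs with and without the &filter=0 parameter,
# so the filtered and unfiltered positions of a page can be compared.
from urllib.parse import urlencode

BASE = "https://www.google.com/search"

def search_url(query, filtered=True, num=100):
    """Return a results URL; filter=0 disables the duplicate-content
    and host-crowding filters (and, as noted above, clustering too,
    so indented "additional" results reappear)."""
    params = {"q": query, "num": num}
    if not filtered:
        params["filter"] = 0
    return BASE + "?" + urlencode(params)

# Open both in a browser and note where your URL ranks in each:
print(search_url("city event"))                  # normal, filtered results
print(search_url("city event", filtered=False))  # same query with &filter=0
```

If the page recovers its old position only in the second URL, the drop looks like the post-ranking filter described above; if it stays at the end either way, it is happening earlier, as some posters report below.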

SEOPTI

5:08 pm on Jul 20, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



No, don't make the internal anchors reasonable and relevant, this will be the cause of -950. Make your internal anchor text NOT relevant instead, use some sort of characters with almost no meaning at all.

[edited by: SEOPTI at 5:10 pm (utc) on July 20, 2007]

trakkerguy

5:25 pm on Jul 20, 2007 (gmt 0)

10+ Year Member



As far as it being manual, I haven't seen a single case in which I found even a hint of it not being completely automated.

Absolutely agree, Miamacs. I've seen some that seem "stuck" at 950 for no reason, but eventually recover somewhat. Seems there is a big delay built in for some things, but it is not a manual penalty.

A script combs through the results, and sends sites to the back, but the ranking stays the same in the main database

Right. You can see in Webmaster Tools where Google says you rank high for a term, but in the search results you are at the end of the SERPs (or now sometimes the middle).

bobsc

6:56 pm on Jul 20, 2007 (gmt 0)

10+ Year Member



It's a hack to address their own internal problem...

Ha - It's a hack all right.

What I'd like to know is why only certain pages get hit - namely a Home page with no apparent problems.

I just don't get it!
There are sites out there getting away with murder while others get penalized for nothing.
It seems to me Google needs to turn some knobs and get it right.

bobsc

7:16 pm on Jul 20, 2007 (gmt 0)

10+ Year Member



Technically speaking, this is a filter.
A script combs through the results, and sends sites to the back, but the ranking stays the same in the main database. Turn off the filters ( &filter=0 ) and see for yourself, sometimes the page is not -950.

Nope - In my case always 950.

jk3210

10:22 pm on Jul 20, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Turn off the filters ( &filter=0 ) and see for yourself, sometimes the page is not -950.

Yep, back to number 1.

tedster

11:30 pm on Jul 20, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



So we have two different -950 situations, then. One is a re-ranking applied via a filter, and one takes place before the filters are applied. That may be a very important distinction in trying to understand exactly what is tripping the -950. As several members have said, -950 is just a mechanism, and it might be triggered by several different criteria.

bobsc

11:56 pm on Jul 20, 2007 (gmt 0)

10+ Year Member



Would age be a "problem"?
Seems to be a big factor from what I see.
My site is almost 3 yrs. old and it seems like I'm back in the sandbox again.
950 for the BIG terms which my Home page used to rank well for.
The sites that rank well for the BIG terms are older.

[edited by: bobsc at 12:37 am (utc) on July 21, 2007]

CainIV

12:04 am on Jul 21, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



What is interesting is that I see the same issue in the SERPs that was present about 2-3 months ago, where websites that are filtered in the results can often show up at various positions (let's say 400th, 610th, 890th) in the same set of results on the same server IP.

tigertom

6:38 am on Jul 21, 2007 (gmt 0)

10+ Year Member



Suggestions:

- Deoptimise drastically, then _slowly_ wind it back.
- Get more backlinks to compensate
- Develop more domains with different templates.

Putting all your hopes on one heavily SEO'd domain is, and probably always was, a mistake. Your site looks 'unnatural'.

I have a meta search engine script on one of my sites. It queries major and minor SE's. The amount of crap some throw up is informative.

The -950 filter is an anti-SEO, anti-spam one. Business as usual, in other words.

steveb

9:11 am on Jul 21, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Technically speaking this is a penalty. It's very misleading to call it a filter, and just because less than 0.001% might come back with filters turned off, that is certainly no reason to ignore the 99.999% of the pages that do not.

Similarly the -30 thing is a clear penalty. Nothing is being removed, things are being moved to an approximate specific place.

ALbino

9:23 am on Jul 21, 2007 (gmt 0)

10+ Year Member



Turn off the filters ( &filter=0 ) and see for yourself, sometimes the page is not -950.

Yep, back to number 1.

Wow, us too, for several kw combos I just checked. Well, that's interesting.

arubicus

10:19 am on Jul 21, 2007 (gmt 0)

10+ Year Member



Turn off the filters ( &filter=0 ) and see for yourself, sometimes the page is not -950.

We have been suffering for a while now. Decided to do the &filter=0 and we stay pretty much the same. Last page almost mid every time. So in our case it seems more like a penalty before the filters. Now this happens with optimized terms and even obscure phrases taken from the body of a page.

Miamacs

11:13 am on Jul 21, 2007 (gmt 0)

10+ Year Member



Hmm.

Tedster may be right, but I'd like to get a hold of this. While there are a lot of reasons to be sent to the back of the SERPs, I just don't think that they have more than one script for this... and at different levels of the ranking calculation. It could be, but my gut feeling tells me that the real answer is somewhere else.

I looked up all the examples ( those which remained -950 since my last post ) and turning off the filter brings back the pages at their regular position.

The strange thing is that querying the database directly reports these ( non-penalized ) results for only half of them. Not all.

But in response to the comments on the &filter=0 parameter, it works for the sites / with the problems I work on so I can't put this into perspective... I see it on all of them, for all phrases. Of course, as I've mentioned, there are only about five to ten reasons and their combos which I encountered so far, so it's not unlikely that they do not apply for everyone.

Of course you do remember that the &filter=0 parameter also turns off clustering.

If there's a site with 600 relevant URLs above your original ( non penalized ) rank, you'll be pushed down 600 positions by the indented "additional" results. The ones you'd see when pushing the "dupe content" link... you know, repeat the search with the omitted results included.

edit: ran a spell check with the other side of my brain. w/ not much success.

[edited by: Miamacs at 11:18 am (utc) on July 21, 2007]

bobsc

11:26 am on Jul 21, 2007 (gmt 0)

10+ Year Member



Deoptimise drastically, then _slowly_ wind it back.

If I deoptimise any more I'd lose ALL Search Engine traffic.

bobsc

11:59 am on Jul 21, 2007 (gmt 0)

10+ Year Member



In my case it also seems that Google has a problem dealing with a particular AdSense poison word (sexy) on some pages.
Don't know why many sites use it.
Maybe I need to deoptimise the "poison word".

tigertom

12:15 pm on Jul 21, 2007 (gmt 0)

10+ Year Member



There may be other factors:

- Too many keywords in internal links
- Too many identical anchor text phrases in external backlinks (appearing too quickly)

I regard my penalty as a 'wake up call' from relying too much on 'old school' SEO for traffic.

I'm guessing it's too much of that + poor quality backlinks are the cause. Poor quality being: all low PR + identical anchor text.

What does a cr*p site look like?

- Popular template
- Larded up with keyword links
- Competitive niche
- 3000 identical backlinks from 'directories'
- Few links from high quality, related, sites.

I guess Google, having access to loads of data, plus human reviewers, can spot a standard cr*ppy site easily. It tweaks its algorithm, some sites fall out, other cr*p sites fall back in, but if the net effect is cleaner SERPs, job done.

Englishuk

12:25 pm on Jul 21, 2007 (gmt 0)

10+ Year Member



Tigertom, so once you have the wakeup call, is it worth carrying on with the same domain, or just move on?

If this is an automatic filter, I can see that once the offending material has been removed/backlinks changed you should then lift yourself out of the filter and bounce back. Unless there is some kind of time penalty. But at least then eventually you can come back.

If it is a manual filter then it will be more difficult. It could take months for someone to get around to reviewing your site. Maybe never.

With so many people affected, it's hard to believe it is manual, unless they have employed 10,000 people, like another thread here suggests, to manually filter sites. But why on earth would a company that has had success with an algorithm change to humans instead of just tweaking the algorithm?

Still think the scary thing is how easy it would be to knock a rival to the end of the SERP's.

tigertom

1:42 pm on Jul 21, 2007 (gmt 0)

10+ Year Member



It isn't a manual penalty. Google = The Google algorithm.

The main cause is over-optimisation:

- Making a page too 'sweet' for a keyword, and
- Using straight keywords in internal links, and
- Having lots of keyworded internal links to other internal pages, on each page.

Scale that back and get more (quality) backlinks (to internal pages). Getting quality backlinks is a good use of your time anyway.

Algorithm: IF a page is larded up with keywords [including 50 keyword-heavy links to other internal pages] AND it's got no good external backlinks AND it's a competitive phrase THEN -950

If Google didn't do this its results would be like MSN's.

In my niche it's a good mix of major leaguers plus cheeky upstarts like me. Not too much rubbish.
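The IF/AND/THEN rule in the post above can be restated as a toy heuristic. Everything here is hypothetical: the signal names, the 50-link threshold, and the boolean logic only paraphrase the poster's guess, not anything Google has published.

```python
# Toy restatement of the over-optimisation rule of thumb above.
# All field names and thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class PageSignals:
    keyword_stuffed: bool        # page "larded up" with keywords
    keyword_internal_links: int  # keyword-heavy links to other internal pages
    quality_backlinks: int       # good external backlinks to the page
    competitive_phrase: bool     # targets a competitive phrase

def looks_950_prone(p: PageSignals) -> bool:
    """IF stuffed AND ~50 keyworded internal links AND no good
    external backlinks AND a competitive phrase THEN flag it."""
    return (p.keyword_stuffed
            and p.keyword_internal_links >= 50
            and p.quality_backlinks == 0
            and p.competitive_phrase)

risky = PageSignals(True, 50, 0, True)
safe = PageSignals(True, 50, 5, True)   # a few quality backlinks break the rule
print(looks_950_prone(risky), looks_950_prone(safe))  # → True False
```

The point of the sketch is that any single mitigating factor (here, a handful of quality backlinks) flips the outcome, which matches the advice to scale back on-page optimisation and earn links.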

europeforvisitors

2:31 pm on Jul 21, 2007 (gmt 0)



With so many people affected, it's hard to believe it is manual, unless they have employed 10,000 people, like another thread here suggests, to manually filter sites. But why on earth would a company that has had success with an algorithm change to humans instead of just tweaking the algorithm?

It hasn't, as you'll see if you read that thread in more detail.

bobsc

2:40 pm on Jul 21, 2007 (gmt 0)

10+ Year Member



The main cause is over-optimisation:

- Making a page too 'sweet' for a keyword, and
- Using straight keywords in internal links, and
- Having lots of keyworded internal links to other internal pages, on each page.

Scale that back and get more (quality) backlinks (to internal pages).


If only it were that simple.
Believe me it's much more complicated than that.

If Google didn't do this its results would be like MSN's.

Thank the Lord MSN doesn't have a 950 penalty.

tigertom

4:14 pm on Jul 21, 2007 (gmt 0)

10+ Year Member



Well, it worked for me.

The only other thing I did was to punt content to other aged domains.

That was too drastic, but I didn't know what the true cause of the penalty was at the time.

bobsc

4:36 pm on Jul 21, 2007 (gmt 0)

10+ Year Member



tigertom,

Your suggestions seem valid for the "re-ranking applied via a filter" cases, but seem to have no effect for the "before the filters are applied" scenario.

Was your case "re-ranking applied via a filter"?

This 155-message thread spans 6 pages.