
Home / Forums Index / Google / Google SEO News and Discussion
Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

This 176 message thread spans 6 pages: < < 176 ( 1 2 [3] 4 5 6 > >     
Google's 950 Penalty - Part 6

 10:16 pm on Mar 10, 2007 (gmt 0)

< continued from [webmasterworld.com...] >
< related threads: -950 Quick Summary [webmasterworld.com] -- -950 Part One [webmasterworld.com] >

I've been so quiet because MHes has said most of the things I would have said anyway.

Mind you, part of this is theory: I see instances, observe certain patterns and behaviours, then analyze what's in front of me. And the end result is what I call SEO. For the rest of the day, at least.

Some points to remember...

As Martin mentioned, the anchor text of links from trusted/locally trusted sites decides 98% of what's in the SERPs. Title and body text are criteria for being relevant (or filtered), but they are essentially binary factors: if they are present and match the incoming anchor, or even the theme of the anchor, the page will rank. Meta is optional.

The title and the content text have two characteristics connected to this problem.

First, every single word and monitored phrase gets a score. Seven-word phrases are not monitored. Which phrases get monitored is probably decided by search volume and advertiser competition, i.e. MONEY, so there is no infinite number of them.

Second, should the page gather enough votes, from inbounds/trust or LocalRank through its navigation, for any single word or watched phrase, it passes a threshold that decides the broad relevance of the page. The page could be relevant for more than one theme: it could be relevant for both "Blue Cheese" and "Blue Widgets" if it gets inbounds for both themes. (Note I'm oversimplifying things; relevance is calculated long before that.) If it's relevant for "Cheese", Google knows it's probably about "food".
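That threshold idea can be sketched in a few lines. To be clear, the cutoff, weights, and theme names below are all invented for illustration; nothing here is Google's actual math:

```python
# Invented-numbers sketch of the threshold idea above: a page becomes
# relevant for a theme once its link votes (inbound anchors, trust,
# LocalRank via navigation) pass some cutoff.

def relevant_themes(votes, threshold=10.0):
    """votes maps theme -> list of vote weights from links."""
    return {theme for theme, weights in votes.items()
            if sum(weights) >= threshold}

votes = {
    "blue cheese":  [4.0, 3.5, 2.0, 1.5],  # 11.0, passes
    "blue widgets": [5.0, 6.0],            # 11.0, passes
    "blue hotels":  [1.0, 0.5],            # 1.5, fails
}
# A page can pass the threshold for more than one theme at once.
```
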

The theme of the page will now make it rank better for certain queries, which aren't necessarily semantically related. A site that ranks #1 for "Blue Cheese" may rank relatively better for "Azure Cheese" than before, even though that phrase is nowhere in the anchors or titles and only appears in parts of the content.

If you cross a certain line of on-page factors, another theme might be evident to you, based on the title/content. But if the page does not have any support for that theme in the incoming anchor text, this may be viewed as trying to game the system if Google doesn't understand the relation. "Blue Cheese" IS relevant to "Kitchen Equipment" to some degree. Google might not know this.

Another, blunter example is mixing up "thematic relevancy" with "semantic relevancy": your "Blue Cheese" page starts to have an excessive number of instances of blue things, like "Blue Widgets" and "Blue Hotels". Google will think this is because you noticed you can rank well for "Blue" and tried to add a couple of money terms that are semantically relevant. But what AdWords, Overture, Trends, or in fact Google Search does not show... is that the algo now knows these things are not related.

The question is... to what degree is this filter programmed?


1. If you have N kinds of phrases on a page that are only semantically relevant (i.e. as "blue cheese" is relevant to "blue widget"), and you don't have support for both, your site gets busted. If popular phrases that you know to be thematically relevant to your page aren't recorded as such in the Google database, you're busted. Based on the previously mentioned problem, if you have a website that's relevant for modeling and you add internal links with names of wars all over, Google may not find the connection.

2. If you do a search on AdWords for "Blue", you'll get a mostly semantically relevant list of keyphrases that include "blue", are synonyms of it, include synonyms, or are related to it. A human can identify the "sets" within these phrases and subdivide the list into themes. Spam does not do this, or so Google engineers thought.

3. So there are subsets in the hands of Google that further specify which word is related to which. These are themes. You'll see sites rank for synonyms within these sets if they're strong enough on a theme, even without anchor text strengthening the relevance. A site that's #1 for "Blue" might rank #9 for "Azure" without even trying too hard.

4. If you have a site about "Cheese", you can have "Blue Cheese" and even "Blue Cheddar" in the navigation, titles, text, for they are included in the same subset. You can't have "Blue Widgets" on the "Blue Cheese" page.

5. What constitutes these sets? Who decides on themes, and based on what? What is the N number of "mistakes" allowed, and how well determined are these?
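For what it's worth, points 1 to 5 can be pictured as a toy membership test. The theme sets, phrase names, and the N=2 cutoff below are entirely made up:

```python
# Made-up sketch of the "theme set" test in points 1-5: count how many
# on-page themes have no link support; past some N, the filter fires.

THEME_SETS = {  # hypothetical subsets Google might hold
    "cheese":  {"blue cheese", "blue cheddar", "stilton"},
    "widgets": {"blue widgets", "widget kits"},
    "hotels":  {"blue hotels", "cheap rooms"},
}

def unsupported_theme_count(page_phrases, supported_themes):
    themes_hit = {name for name, members in THEME_SETS.items()
                  if members & page_phrases}
    return len(themes_hit - supported_themes)

def tripped(page_phrases, supported_themes, n=2):
    return unsupported_theme_count(page_phrases, supported_themes) >= n

# A "Blue Cheese" page stuffed with blue widgets and blue hotels, with
# link support only for the cheese theme, trips the filter; a page that
# stays within the cheese subset does not.
```
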

But then, neither are the SERPs right now: there are at least 4 different kinds of ranking I've seen in the past 3 days.


So far I've only seen instances of filtered pages when 5 to 6 themes collided all at once. That's quite easy to do by chance if you have a completely legit "partners" or "portfolio" page with descriptions and/or outbound text links, but only a single theme that's supported by the navigation/inbounds, and only if there is a decided theme for the page. If there's no theme (the navigation, and the lack of inbounds, don't strengthen either one), I'd say Google passes on penalizing.

As for the themes, I was thinking perhaps Google went back to the good old directory age and started from there. Remember how you started with the broad relevancy, then narrowed it down to a theme, then an even closer match? With cross-references where applicable.


This isn't new. Penalties that are based on it are.

If there is such a penalty, it works along these lines.

[edited by: tedster at 9:16 pm (utc) on Feb. 27, 2008]



 10:52 pm on Mar 18, 2007 (gmt 0)

There certainly are different things going on.

I couldn't agree more. I believe the causes are different, but it seems they are all leading to the overall same effect.

Is it possible that Google is shifting their overall penalty strategy to a standardized "penalty box"? Think of it like a hockey game, different penalties can be committed, but no matter what all players end up in the same box.

This will prevent individual penalties from being analyzed and cracked. One person may have unnatural links, another can have duplicate content, but they both end up with the same result.

Seems to me that threads based on old penalties like -31 have pretty much disappeared and most sites having problems are on the 950 bus.


 11:23 pm on Mar 18, 2007 (gmt 0)

Is it possible that Google is shifting their overall penalty strategy to a standardized "penalty box"?.....Seems to me that threads based on old penalties like -31 have pretty much disappeared

Good question. It would probably be an effective strategy for Google. You have a signal that there is something they don't like about a site, but no clue what. So (some) webmasters scramble around and clean up anything they can find that they think Google doesn't like. And others learn to be more careful.


 9:05 pm on Mar 20, 2007 (gmt 0)

Removed some quality links pointing to a site not in the 950+ box.
Effect: site listed 950+ now

So it seems like quality links might "heal" the 950+ PEST.

[edited by: SEOPTI at 9:07 pm (utc) on Mar. 20, 2007]


 2:29 am on Mar 21, 2007 (gmt 0)

I just saw my site bounce back from this 950 penalty this afternoon. I got hit with the 950 penalty about 3-4 weeks ago. When I saw it, I had a suspicion as to what caused it - a new blog I added into my site probably tripped it up with "run of the blog" internal links back to the main domain's home page with the target keyword phrase in them. I removed those links pretty quickly after getting hit, and as of this writing I have recovered my pre-950 penalty ranking.


 9:48 am on Mar 21, 2007 (gmt 0)

SEOPTI and egomaniac

Both observations would be in keeping with a phrase-based penalty. It's always difficult to know when offline analysis is done by Google, but I would be suspicious of too many blog internal nav links flagging me up for 950, and I would also be concerned that removing some links may mean losing 'redeeming' factors that compensate for the 950.


 12:04 pm on Mar 21, 2007 (gmt 0)

Many of my pages went -950 after adding over 100 new pages, all of which text-linked back to my homepage and another page using my main keywords. I have now replaced these text links with graphic links and, fingers crossed, hope that will do the trick. Does anyone think this might work?


 12:53 pm on Mar 21, 2007 (gmt 0)

>Does anyone think this might work?

A theme could be carried by an image link from one page to the next, even without anchor or even alt text. If you are trying to lower your potential N count for predictive phrases on the target page then I don't think this will work..... it all depends on what 'themes' the image links carry with them.


 3:55 pm on Mar 21, 2007 (gmt 0)

It certainly could be coincidence that I deleted those links, and my ranking recovered on the next 3-4 week "update cycle". There's no way of really knowing for sure.

Another detail that is important...

The site has been in Google, and high-ranked for a long time. Prior to adding the blog last November, my adding of pages and changes to it were very, very infrequent. Sometimes no change for months at a time. Then in November and December I started adding pages via the blog. And in January I started blogging 5 days a week.

So the net effect was lots of new internal links pointing back to the home page, growing at a rapid, non-historical rate.
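A spike like that could, in principle, be caught by comparing recent link growth against the site's own history. A rough sketch, where the 3-month window and the 5x factor are invented for illustration:

```python
# Hypothetical sketch of flagging "non-historical" link growth: compare
# the recent rate of new internal links against the site's own baseline.

def growth_spike(monthly_new_links, recent=3, factor=5.0):
    history = monthly_new_links[:-recent]
    window = monthly_new_links[-recent:]
    baseline = max(sum(history) / len(history), 1.0)
    return (sum(window) / recent) > factor * baseline

# Months of almost no change, then daily blogging with run-of-the-blog
# links: the recent average dwarfs the historical baseline.
```
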

I don't think that it's purely the links - I think there are other factors in combination that caused the links to be the gating factor.

Those are my thoughts.


 4:10 pm on Mar 21, 2007 (gmt 0)

I don't think that it's purely the links - I think there are other factors in combination that caused the links to be the gating factor.

That may well be. Often we only have to eliminate one factor to bring the page back.

In this case think usability. How many users are going to read through a long list of links? I think a lot of sites overdo the internal linking.

Usability information suggests no more than 7 links on a page. I'll admit that I often go over that. But think about it: you really just need a few links in your navigation.

One to the homepage
one to the contents of the sub theme
2 or 3 to other pages on your site that relate closely
1 or 2 outgoing links to quality related pages if appropriate


 7:11 pm on Mar 21, 2007 (gmt 0)

I've done a major clean-up and I'm coming back!

....on Yahoo, that is.

As before, I'm not convinced that any current Google trouble has anything to do with penalties or filters.


 10:59 pm on Mar 21, 2007 (gmt 0)

Wow .. freaky, I had never heard of a google 950 penalty, but it describes exactly what happened to my site

for one search term I was on the 1st page on google, yahoo and live.com .. now it's first page on yahoo and live, last page on google.

now I just have to read thru the 1000s of messages here to see what I can try to do about it - after reading a few posts my best bet is that it is over optimized

I have other search terms that have not been 950'd and they are just as optimized. Hmm, strange.




 2:57 pm on Mar 22, 2007 (gmt 0)

Another page came out today after a few changes, but I found a new directory had been hit (GAH!). One keyword set in particular seems to be contagious: any directories that have a page on that 'theme' get whacked. The original root page with this theme still hasn't recovered after pretty much 9 months (apart from a two-week period).

I see tons of other decent sites with pages about these keywords down there in the 900s with me :(


 4:14 pm on Mar 22, 2007 (gmt 0)

Here are two theories that are being floated for some of the end-of-results problems:

Theory 1 - in some cases this may be what Google is doing with "thin affiliate" pages.
Theory 2 - in some cases this may be what Google is doing to sites that buy links.

Any comments? My thinking is that if such penalties were true, they would impact every search and from what I see that's not true. But maybe some of you have examples where it is true.

[edited by: tedster at 4:17 pm (utc) on Mar. 22, 2007]


 4:17 pm on Mar 22, 2007 (gmt 0)


Add item 3, bounce rate. Google is probably looking at traffic patterns as well and seeing that traffic bounces, doesn't add the site to favorites, etc.....

Item 4, No value added--- Too much duplicated content with little or no value to the user...


 6:26 pm on Mar 22, 2007 (gmt 0)

<<Theory 1 - in some cases this may be what Google is doing with "thin affiliate" pages.>>
<<Theory 2 - in some cases this may be what Google is doing to sites that buy links.>>

In my case...

1) Numerous 4,000-word articles with no aff links, on numerous sub-topics of my main topic, are all at 950. All from different sub-directories, too.

2) No links ever bought in 11 years. No recips ever done in 11 years.

All long-tail searches have been sent to 950+.

The strange part...
I'm still at number 1 for my main group of travel searches which is the most competitively spammed-out category of anything I deal in. I rank number 1 for every permutation of [mycityname hotels, hotels in mycityname, etc] you can think of. One of my pages that also still ranks number 2 for several of those [mycityname hotels] searches contains about 80 aff links.

Bottom line is, I've lost 80% of my traffic, yet I've only lost 15% of revenue. (mostly Adsense revenue from long-tails)

I don't think this is a "penalty" at all. I think it's the by-product of an algo change and it's here to stay.


 6:55 pm on Mar 22, 2007 (gmt 0)

Hi Tedster, I've never bought any links, so theory #2 doesn't apply to me. As for the Thin Affiliate page... Maybe. The thin description would apply to the page that got hit. There are no, and never have been any affiliate links off of that page, or off of other pages on the site. I sell my own products on the site. I do have links to 1ShoppingCart order pages elsewhere in the site, and those "could" look like affiliate links, but there aren't any on the page that got hit.


 7:03 pm on Mar 22, 2007 (gmt 0)

I see tons of other decent sites with pages about these keywords down there in the 900s with me

How about the sites that are still at the top for those keywords -- anything stand out as obviously different?


 8:03 pm on Mar 22, 2007 (gmt 0)

It could be that they are applying some sort of sandbox effect to long tail searches because long tail searches are bypassing the sandbox effect in some cases.

[edited by: SEOPTI at 8:04 pm (utc) on Mar. 22, 2007]


 8:18 pm on Mar 22, 2007 (gmt 0)

I've searched:

site:domain.com (shows 100% URLs)

site:www.domain.com (shows just 17% URLs)

Is it bad or am I missing something?

The site is defined as www.domain.com, so if you write anything without "www" it loads the proper version and changes the URL.
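As an aside, what matters in a setup like that is whether the non-www host answers with a true 301 to the www host. A small sketch of the check (hostnames are placeholders; feed it the status code and Location header from fetching the bare domain without following redirects):

```python
# Placeholder-hostname sketch: only a permanent (301) redirect that
# lands on the www host avoids splitting URLs between the two hostnames.

def canonical_ok(status, location, canonical_host="www.mysite.com"):
    if status != 301 or not location:
        return False
    # location like "http://www.mysite.com/..."; element 2 is the host.
    return location.split("/")[2] == canonical_host
```

A 302, or a rewrite that simply serves the same page on both hosts, would fail this check and can split indexing between the hostnames.
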


 8:48 pm on Mar 22, 2007 (gmt 0)

This has nothing to do with 950+; the site: command is simply broken, and it always has been.


 9:16 pm on Mar 22, 2007 (gmt 0)

Then what if your SERPs seem to be back (since earlier this month) but you still get only 70% of the visits you used to?

Before this nonsense started, we never used to have identical daily stats, let alone weekly stats; I think due to a natural random factor. Now it's like you can draw the same line (as when we're deep down) over and over again.

I mean something like:

Before (good old days):

16 14 17 19 16 15


6 6 6 6 6 6 6


10 10 10 10 10 10


 9:22 pm on Mar 22, 2007 (gmt 0)

it loads the proper version and changes the URL.

By what mechanism? Anything but a 301 redirect can create canonical issues.

I don't think this is a "penalty" at all. I think it's the by-product of an algo change and it's here to stay.

I'm tending to think the same thing.

There seems to be a re-ranking of the "core algorithm's" original results, taken as a last step. I can almost smell the re-ranking mathematics all over it.

For instance, note that the drop is not a consistent number of places, the way you might expect with a conventional "penalty." Instead, the url always falls to near the end of however many results are generated before the "omitted results" link. It feels to me as if the original relevance factor is being multiplied by some percentage, but the url still stays in the primary index - no new urls are being pulled into the results list -- and sometimes a "penalized" url can even rank well for another search term.
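That re-ranking picture can be sketched with invented numbers; the point is that a flagged url sinks to the end of whatever result set was generated, not a fixed number of positions:

```python
# Invented-numbers sketch of a last-pass re-ranking: the core algorithm
# orders results by score, then flagged urls have their score multiplied
# by a small factor, so they sink to the end of the generated set.

def rerank(scored, flagged, demotion=0.01):
    adjusted = [(score * demotion if url in flagged else score, url)
                for score, url in scored]
    return [url for _, url in sorted(adjusted, reverse=True)]

results = [(0.9, "a.com"), (0.8, "b.com"), (0.7, "c.com"), (0.6, "d.com")]
# Flagging b.com drops it from #2 to dead last, not a fixed distance.
```
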

Also note reports of popping from the end to the top and back, as if some threshold measurement is being tweaked and sometimes the page is over and sometimes under.

But what seems quite unclear is the factor or factors that trip this re-ranking. Some people have worked with de-optimizing on-page factors and anchor text and seen improvements. One case I know of seems to have stably escaped the penalty after just one IBL was added that INCLUDED the exact target phrase in anchor text -- in other words, an extra SEO factor was added, not de-optimized.

Other examples I've looked at show no obvious "over-optimization" at all, so there's nowhere obvious to de-optimize anything.

Add item 3, Bounce rate.

That's a very interesting idea, trinorth. What makes you think this? Many of the pages I'm seeing dropped to the end-of-results would have made me a very happy searcher if they were still a top result. Even if the bounce rate for these pages were a bit high, why such a harsh reaction? Might Google be using flawed data for bounce rate calculations?

This is the most puzzling part for me. What is Google trying to do that would inflict such extreme damage on what seem to be quite relevant pages? Are they just in the early stages of working in some new algo factor and having problems with it?

There certainly could be several different "tests" being applied, each of which might result in an extreme drop. I doubt that any or all of the tested factors would be simplistic - every such simple theory (like title=H1) has obvious counter-examples.

If the top ten were nice and clean on these searches, then just maybe Google would feel that such collateral damage was acceptable, at least for now. But the top ten don't always look clean and relevant to me on these searches - so why throw away what were formerly decent to good results? We can see that this is happening, but that's not enough to fully understand it.


 9:25 pm on Mar 22, 2007 (gmt 0)

The only problem with theories 1,2,3 & 4 is that a lot of sites or pages have been hit that don't have any of these factors. I'm not saying that Google isn't trying to weed out those kinds of pages but they are missing the mark a lot.

On links to affiliates, I don't see any difference between pages that have them and pages that don't. I just don't think it's a factor.


 10:06 pm on Mar 22, 2007 (gmt 0)


I noticed this on one of our sites. We were ranking number one for the keyword "neon light kits". Google recently pushed us way down in the serps for that keyword. That site sells an alternative to neon lighting for buildings or signs. The landing page was good, but people were hitting the site hard on that keyword "neon light kits" and only viewing that page. We suspected that they were hitting the back button and going to the next result.

So we did some digging, actually had a customer call about the products that the site was selling and he made this comment "You guys have some neat lights, I was looking for a light kit for my dodge neon car and I found you"

Bingo, that told us that on that specific term "neon light kits" that the majority of searchers were looking for Dodge Neon Car light kits and not the product we were selling which makes sense.

Google obviously figured out the majority of people searching that term were looking for something different.


 10:13 pm on Mar 22, 2007 (gmt 0)

Also, notice how some people talk about decreased traffic, but increased conversions. Google might be doing a better job in matching search terms to conversion data.

If a site has adsense or uses google analytics, google has great insight on all the traffic and how users flow through the site.

You read threads on here about people putting analytics on their site, then all of a sudden crashing in the serps. If google only had limited access to information before the site had analytics, google had less data to judge the site on for serp placement.

The person adds adsense or analytics to the entire site or part of it. Google now knows all the information.

So pre-analytics, google might not have seen the 80% bounce rate. But now it does.... Thus the fall in the serps...

I would also say that average page views plays a role in the algo as well.
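The two signals being speculated about here, bounce rate and average page views, are simple to compute if Google does have session data. The figures below are made up:

```python
# Made-up session data illustrating the two speculated signals: bounce
# rate (share of single-page visits) and average page views per visit.

def bounce_rate(sessions):
    """sessions: one page-view count per visit."""
    return sum(1 for pv in sessions if pv == 1) / len(sessions)

def avg_page_views(sessions):
    return sum(sessions) / len(sessions)

visits = [1, 1, 1, 1, 3, 1, 1, 1, 5, 1]  # 8 of 10 visits bounce
```
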


 10:57 pm on Mar 22, 2007 (gmt 0)

it loads the proper version and changes the URL.

By what mechanism? Anything but a 301 redirect can create canonical issues.

RewriteCond %{HTTP_HOST} ^mysite\.com$ [NC]
RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]


 11:14 pm on Mar 22, 2007 (gmt 0)

Also note reports of popping from the end to the top and back, as if some threshold measurement is being tweaked and sometimes the page is over and sometimes under.

Now I am seeing this happen with my page. It's back down at 950+ something today - about 4 positions away from the "end" of the results.

This page was doing well, usually top 3-5, for a very competitive 2-word phrase for years. Then in January and February it started dropping for the first time in years. By mid-February it was on page 2 instead of page 1. Yesterday when it "came back" it was at #16, which was about where it was right before it got the 950 penalty around the end of February.

It does seem to me that some dial is being tweaked and this page is right on the edge.


 11:42 pm on Mar 22, 2007 (gmt 0)

No, I don't think so... it's more like a sift effect. Unlike an algo that kicks in and is done, a sift goes through pages like a ripple effect; it doesn't affect all pages at the same time...


 10:41 pm on Mar 23, 2007 (gmt 0)

I got hit with this issue three times:
June 27
At some point in January, and three days ago, March 21st.
Now I'm still at 950+.
On both previous incidents the site came back a few weeks afterwards without my doing anything.
My site is in Spanish and is a directory: lots of companies in different industries, lots of internal linking. People get in, find the info they want, and leave quite early... because they found what they wanted! A phone number, a name, an address.
I can't believe Google assumes that a site is useless because people leave it early. They leave early because they got what they wanted. People have a life; they want to do things in their real life. I can't believe Google assumes that people with a normal life HAVE to spend hours and hours in front of a screen! It's simple reasoning. I really doubt Google is applying this type of criteria.


 12:07 am on Mar 24, 2007 (gmt 0)

carlitos: Do you run a local directory? With some local sites the bounce rate is quite high (80% and more).

[edited by: SEOPTI at 12:07 am (utc) on Mar. 24, 2007]


 5:21 pm on Mar 24, 2007 (gmt 0)

Also, notice how some people talk about decreased traffic, but increased conversions.

Could that be a matter of getting fewer long-tail leads?
