Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

This 201 message thread spans 7 pages: < < 201 ( 1 [2] 3 4 5 6 7 > >     
Google's 950 Penalty
What do we know about it, and how do we get out of it?

 2:05 pm on Jan 11, 2007 (gmt 0)

I've read a lot about Google's -30 Penalty, where pages on a site drop 30 positions. But most of what's been said about the 950 Penalty, where pages on a site drop to the very bottom or last page of the search results, has been scattered across other topic threads.

What do we really know about this penalty, what causes it, and most important of all, how do we fix our sites to restore normal rankings?



 6:20 am on Jan 14, 2007 (gmt 0)

I wonder if it isn't an issue with linking to affiliate programs. That's our primary income source. And no, my content isn't just rehashed, either. It's content that's been developed for years.

It's possible that affiliate programs could be a factor if the affiliate tail is wagging the content dog, but I've got affiliate links on most pages and haven't noticed any significant ranking changes for better or worse.

The explanation is likely to be more complex than whether a site has affiliate links.


 8:38 am on Jan 14, 2007 (gmt 0)

How about:

1) SEO 2006 = certain keyword density and placement to bring you right up to, but not over, the line. Maybe even linking / architecture strategies too.

2) Google thinks, "Hmmm, certain areas are full of SEOs."

3) Google moves the line for some search terms in 2007.

I am pretty sure it is keyword density / architecture, not the site (domain), though. I have one site with two main parts, in two competitive search areas known to attract Google's attention. I decided to change the format of one part a little to bring certain pages closer to the section root (I would say index, but the index pages of both my domain and each section are not as highly ranked as a certain page in each that people naturally link to); I did this to overcome a problem with Google last year! Now search term 2 shows section 1. It looks kind of odd in the search returns, so I had to change section 1's page a little and put a link, "Click here for Section 2," right at the top of section 1's main page. BTW, section 1 also ranks #1 for search term 1. Section 2 used to rank #1 for search term 2, but now section 1 does and section 2 is not even shown!

It is like I have a widgets site with two sections, boat widgets and house widgets (I used this differentiation because I think blue and red widgets does not show the huge gap between the two sections well enough). Now someone searching for "house widgets" gets "Don't let your boat sink, get your boat widgets here" in the Google SERPs.

I thought putting the link to section 2 at the top of section 1's main page might hold the key to having the more logical and preferred results return in Google, but it did not. Shortly I will have to redo the site architecture for section 2 again, as clearly this is the issue. BTW, it is actually a non-profit site.


 1:02 am on Jan 15, 2007 (gmt 0)

The explanation is likely to be more complex than whether a site has affiliate links.

My site has no affiliate links, and never has.

Meaning, I agree with this observation.


 2:31 am on Jan 15, 2007 (gmt 0)

Where is the first part of this thread?


 3:31 am on Jan 15, 2007 (gmt 0)

I am pretty sure it is keyword density..

I agree. I just reduced it by 20% for the terms penalized. Now, to get Googlebot to return.


 6:01 am on Jan 15, 2007 (gmt 0)

A couple of thoughts, observations...

First, I don't think "950" is really accurate... it's more like "end of results". For some of my keywords with over 100 million results, I can only page down to result 750 or so... and of course in the last 50 results I find my former top-10 site.

Second, I suggest that you go to the last 50 results of your niche, and I think you will find a couple of old neighbors down there as well. In my instance, for one of my extremely competitive terms, I am joined down there by 4 historic top 20 sites for my niche.

Looking at those sites, there is nothing in common. Two of us use affiliate links, both with redirects. The other three have no affiliate links whatsoever (they are some of the primary industry leaders that the rest of us link to through affiliate links).

A couple of us have moderate keyword density, others do not. Some of us are small, independent sites... another is the niche site of General Motors.

It is more or less page specific, rather than site-wide. In fact, for each of us the page now at the bottom was the primary page for our niche. Yet we all now have other pages that rank (not well... around 100) for other terms in this niche, or our homepages rank for the keyword where the "penalized" page used to be top 20.

So, it's almost like Google has replaced our main pages for this niche with alternative pages... which don't rank as well because they aren't optimized or backlinked to for these keywords.

All penalized pages are internal pages... except the General Motors page, which is the homepage (of the niche site, not the gm dot com homepage).

I am totally lost on this one. I was thinking it was affiliate related or an OOP (over-optimization penalty) of some sort, but the other fallen pages don't fit a trend. Take a look down there for your old top-10 neighbors and post your observations.


 8:47 am on Jan 15, 2007 (gmt 0)

"A couple of us have moderate keyword density"

Moderate keyword density by whose measure? Keyword density where?

I reckon I have already given away a lead to the "magic sauce". When you think about it, it is quite logical. What do webmasters / wannabe SEOs know about getting rankings from Google? So where would Google look to see if there was an unnatural footprint separating this wannabe SEO from normal web sites? Also, it is clearly a filter, not a penalty.


 1:22 pm on Jan 15, 2007 (gmt 0)

Moderate, meaning the density on my page is about 4%, none above 6%, and two pages use the keywords exactly once, resulting in a density under 1%.
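For anyone wanting to measure density the way the posters seem to (occurrences of the phrase as a share of total words on the page), here is a minimal sketch. Note the exact formula varies between the third-party density tools mentioned elsewhere in the thread, so treat this definition itself as an assumption:

```python
import re

def keyword_density(text, phrase):
    """Percent of the words on a page accounted for by occurrences of
    `phrase`: (hits * words-in-phrase) / total-words * 100."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    target = re.findall(r"[a-z0-9']+", phrase.lower())
    if not words or not target:
        return 0.0
    n = len(target)
    # count every position where the full phrase appears, word for word
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == target)
    return 100.0 * hits * n / len(words)
```

By this measure, a two-word phrase starting once every 25 words works out to about 8%, which puts the 3-6% figures quoted above in context.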


 1:43 pm on Jan 15, 2007 (gmt 0)


I am looking at top-ten pages on Google with a keyword density of 14%. I was top ten with a keyword density of 3%, and now I don't know where ALL my pages are....



 3:37 pm on Jan 15, 2007 (gmt 0)

It seems like there are 2 types of issues in this thread.

One where some of the keyword phrases drop and the other where they all dropped.

In the case where some of the phrases dropped: What is the common problem with just the pages involved?

In the case where they all dropped, the problem is deeper. In our case it seemed that once Google found one problem, they became more critical of other issues, therefore bringing the whole site down. The first three things I would look at are:
1. Duplicated title phrases. This was the primary suspected culprit with us.
2. If your site: operator doesn't show your most important page first, I would look at validation issues where Google couldn't follow an important page.
3. In our case, 302 hijacking was part of the problem. Check whether any other sites show up with 302s to your home page (a search for inurl:yoursite.com -site:yoursite.com will give you a list). This isn't as likely now as it was 6 months ago, based on all of the thread inputs I have seen.
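The duplicated-titles check in item 1 above is easy to automate against a local copy of a site. This is only a sketch; it assumes your crawl is stored as .html files under one directory, which may not match your setup:

```python
import re
from pathlib import Path

def find_duplicate_titles(root):
    """Map each <title> text found in saved .html files under `root`
    to the list of files that use it, keeping only titles that appear
    on more than one page."""
    seen = {}
    for path in sorted(Path(root).rglob("*.html")):
        html = path.read_text(errors="ignore")
        m = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
        if m:
            title = " ".join(m.group(1).split())  # normalize whitespace
            seen.setdefault(title, []).append(str(path))
    return {t: files for t, files in seen.items() if len(files) > 1}
```

Any title that comes back with two or more files is a candidate for rewriting into something unique.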

Good luck.

[edited by: tedster at 5:28 pm (utc) on Jan. 15, 2007]
[edit reason] remove accidental smile face [/edit]


 3:58 pm on Jan 15, 2007 (gmt 0)

First, I don't think the "950" is really accurate... it's more like "end of results".

Good observation. If you set your preferences to 100 results per page, then search for your site using your main keyword(s), you will be found on the last page. However, typically that only gets you to around #700. At the bottom of that page is "repeat the search with the omitted results included." Click that, and now you're at the bottom of the tenth page.

So yes, you're right; it's the "End of the Results" penalty. Which raises the question, for this particular penalty/filter: why the end?

Second, I suggest that you go to the last 50 results of your niche, and I think you will find a couple of old neighbors down there as well.

Correct again; a long-established site that's a competitor of ours sits today at the absolute last position available, #1,000. This site had been hanging around the first three pages for 7 years.

It clearly is a filter, not a penalty also.

Could you elaborate on this? Typically you tend to think of a filter as something that blocks movement forward, and a penalty as something that sends you back.

Is this filter/penalty on-page generated or off-page? If it is on-page, I guess anything is possible. But keyword density as the culprit?


 4:44 pm on Jan 15, 2007 (gmt 0)

Matt Cutts, at Pubcon, said this about a penalized site:

"when I checked the whois on them, they all had “whois privacy protection service” on them. That’s relatively unusual. Having lots of sites isn’t automatically bad, and having PPC sites isn’t automatically bad, and having whois privacy turned on isn’t automatically bad, but once you get several of these factors all together, you’re often talking about a very different type of webmaster than the fellow who just has a single site or so."


I have a few sites that got hit by the -30 penalty. I wasn't doing any on-page blackhat stuff, but I was getting pretty aggressive with link-building techniques.

I'm pretty sure the penalty was a result of the combination of private whois data, overly-aggressive link-building (and lower-quality links), and maybe the PPC-aspect of the sites.

According to what Matt said at the conference, they do seem to be looking at some aspect of your advertising to determine the quality of your content. For example, if you have 3 336x250 blocks of ads in the main body of the page, it could set off some kind of filter.


 5:09 pm on Jan 15, 2007 (gmt 0)

First, I don't think the "950" is really accurate... it's more like "end of results".

I chose that term because it had been used in several other previous threads, by long time members, as descriptive of this action. And when webmasters check keywords, they often set the default to 100 per page, at least I know I do, so depending how competitive the search term is, you can get to 700, 800, or more results. The "950 Penalty" doesn't mean specifically that your site shows up at #950, it just represents the bottom of the search results, whatever that number may be. Theoretically, Google returns 1,000 results for a search.

The site I've been concerned about appears to have come out of this penalty, or whatever it is. I'm going to see if it lasts before I get too excited about it, though.


 5:29 pm on Jan 15, 2007 (gmt 0)

My site was affected (~#900 for the home page and some other important pages), BUT I would like to confirm that the home page has recovered to the position we held before this "action", and I hope that other pages will recover soon. Looks like it was due to an index update. Time will tell.


 5:39 pm on Jan 15, 2007 (gmt 0)

AndyA, did you do anything special to your affected site? I 100% agree with the end-of-results idea. I guess they have an algo like this:

First, they find the problematic URL, either through a manual site review or a robotic one, and put it in some kind of list.

Second, they already have the normal search list, but once that list is presented to the requester, they match it against this blacklist, remove those URLs from the top results, and put them at the end.

Major reasons could be the following:

1) If you have any other site and a lot of crosslinks between them, you may get this. (It can be detected either manually or by a robot.)
2) Any kind of regular or boilerplate text at the bottom which appears every time. (A robot can detect this.)
3) The same kind of bottom or footer link everywhere. (A robot can detect this.)
4) I totally discount the idea that this is because of paid backlinks. If that were part of the algo, then Microsoft or Yahoo could pay thousands of porn site owners to link to Google, or a wealthy competitor would do the same to put a rival to rest in peace. (This should not even be part of the review, because it would be a big bug if they went this route at all.)
5) Over-optimization and not having good outgoing links on the subject matter. (It can be manual as well as robotic.)

I guess if we got penalized by a robot, then we can get it fixed in one or two months. But if it came through some kind of manual review, then I am not sure how long it will take, because if they went for manual review it means they have not yet been able to code that review logic into the robot. So it will be difficult to get out from under this one.

If anybody has tried this, please let me know your results; others' points of view are greatly appreciated before I try to fix my affected sites.




 5:54 pm on Jan 15, 2007 (gmt 0)

I'm thinking these "950" results might be related to Local Rank calculation or something similar. Local Rank is keyword specific, and it throws out all the PR that comes from domains that are not also in the top 1000 results, so the final SERP is sort of a "jury of your peers" vote.
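The "jury of your peers" idea above can be sketched in code. This is only a toy reading of the Local Rank concept as described in this post, not Google's actual computation: re-score each result by counting only the links it receives from other domains inside the same result set.

```python
def local_rank(results, links):
    """Re-order `results` (domains, best-first global order) so that
    domains with more inbound links from OTHER domains in the same
    result set come first; links from outside the set are ignored.
    `links[d]` is the set of domains linking to d."""
    pool = set(results)

    def local_votes(d):
        # only count voters that are themselves in the result set,
        # and never let a domain vote for itself
        return len((links.get(d, set()) & pool) - {d})

    # stable sort: ties keep the original (global) ordering
    return sorted(results, key=local_votes, reverse=True)
```

Under this model, a page whose backlinks all come from outside the top results gets zero local votes, which would be consistent with a formerly strong page sinking for one query while ranking fine for another.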

[edited by: tedster at 7:59 pm (utc) on Jan. 15, 2007]


 6:11 pm on Jan 15, 2007 (gmt 0)

Boy, I wish we could post URLs here... not my site, but an example site that is getting hit... So here I will go with the "widgets" thing...

The General Motors niche homepage had historically ranked top 10 or so for a search for "buy red widget" and related searches. Frankly, it deserves to be there. Now the following results:
"Red Widget" - End of Results, formerly top 20
"buy red widget" - Top 10
"buy red widgets" - End of Results, formerly top 20
"buy 'red synonym' widget" - End of Results (where google previously allowed them to rank for an obvious synonym of "red" without any on page text of the synonym.
"'red synonym' widget" - Top 10

Again, this is all for the same homepage. In this instance the plural invokes the "penalty" and the singular still places top 10 results. Keyword "Buy Red Widget" appears exactly once on the page... no other instance of "red", "widget", or "buy" on the page. No affiliate links etc.

I have other examples of sites that rank for one variation of essentially the same search and used to rank for a second variation, and now are down at the bottom for the second variation.

For those of you who track competitors search positions daily... from early January to now, check which sites have left the top 20 for your most competitive terms... then check the bottom results (they will probably be there)... Then look at similar searches with slight variations (plural, synonym) and see if the same page still ranks for slight variations.


 8:01 pm on Jan 15, 2007 (gmt 0)

Okay guys. Now, during this current update, which appears to be at minimum a PR update, I have returned to the top position for one of my pages. I did dramatically reduce the KW density of this page. But was it due to the KW density change? I am not really sure.

However, the other page that was knocked out has yet to return, and I did not change the KW density on it. I am not going to change that density either, at least temporarily. This may give us some insight.

Other things I did change, which may or may not come into play in the future, but which did affect both pages that suffered:
1. Robots.txt'ed the redirect page for my affiliate links.
2. Robots.txt'ed index.php?param=value, a page that was very similar to the main page but whose intention is to direct users to take an action.
3. 301'ed /dir/ to /dir/index.php. I know this is a little backwards from the norm, but I chose to use /dir/index.php as the main URL for all directories.
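For anyone wanting to replicate changes 1-3, they might look roughly like the following. The /go/ path for the affiliate redirect script is a hypothetical stand-in, and the RedirectMatch line assumes an Apache server; adjust both to your own setup:

```
# robots.txt -- block the affiliate redirect script and the
# near-duplicate parameterized page (Disallow is a prefix match,
# so "/index.php?" catches any query string)
User-agent: *
Disallow: /go/
Disallow: /index.php?

# .htaccess (separate file) -- 301 /dir/ to /dir/index.php,
# the reverse of the usual convention, as the poster notes
RedirectMatch 301 ^/(.+)/$ /$1/index.php
```

The redirected URL no longer ends in a slash, so the RedirectMatch pattern does not loop, but a rule this broad would also fire on directories you did not intend, so it is a sketch rather than a drop-in.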

I will report back in the next week or so.


 8:56 pm on Jan 15, 2007 (gmt 0)

@ ron252:

All I have been doing is going back through adding unique meta descriptions (making sure they are short enough to be included completely in the snippets) and cleaning up the HTML code wherever I found issues (and there are a lot of issues with this site: it is older, circa 2000, and was originally built with a WYSIWYG editor). As a result, there is a lot of unnecessary code, and I'm in the process of removing it and slowly updating the site to CSS and more current HTML standards.

I know Google is on record as stating we shouldn't worry about the code, as they strip it out anyway, but I was concerned to see multiple <FONT> tags separating the anchor text and the <a href> tags. I figured that had to be bad for any search engine trying to decipher what exactly the anchor text was in between all the font sizes, faces, colors, etc.

I did also upload a URL text map to Google. I figured it might help Google sort out what pages are mine and which ones aren't mine, in case there's anything going on in the form of 302 abuse. It would seem the map should help Google determine which pages DON'T belong to a particular site as well as which ones DO BELONG.

Of course I don't know if any of this is what corrected the problem. The site sat dormant for about 2 years, with very few updates, after getting hit in November 2004 with some sort of penalty. I just lost interest in it, thinking it was a waste of time. E-mails from visitors asking about updates and telling me how much they love the site motivated me to work on it again, if only for them.

I now believe my forum was to blame. It had been added in August 2004, and I now know it was generating tons of duplicate content. I disallowed the forum in robots.txt and those pages are slowly being deindexed. The page count in Webmaster Tools is now much closer to the actual number, so I know those duplicate pages were weighing the site as a whole down, likely messing up internal PR distribution too, which caused some lower-level pages to go supplemental.

I've always felt that Google overlooks some things, but when a site reaches a certain level of issues, it is impacted in the rankings. I envisioned every page I worked on having the ability to finally lower my site's penalty score enough to restore good rankings in the index.

I'm still not convinced this is a permanent change, due to the fluctuations of the index, but I'm still slowly making corrections so I hope this is enough to keep me out of trouble, since I never intentionally did anything wrong to start with.

[edited by: AndyA at 9:01 pm (utc) on Jan. 15, 2007]


 10:06 pm on Jan 15, 2007 (gmt 0)

I just saw that the end-of-results penalty is moving upwards: for my targeted KW "ABC", Google was listing my site just above the end of the results, but just now it is showing in the middle. I guess we cannot say for sure whether it is an end-of-results penalty or something else. We need to wait some more time in order to find some kind of pattern. Guys, wait some more time, maybe a week or so.




 12:55 am on Jan 16, 2007 (gmt 0)

Since this penalty has existed at least since September 23, 2005, I'd say 16 months is long enough to wait...

The penalty is plain enough. Occasionally for some terms you don't go all the way to the back, like a search for the entire title, or the title plus the H1 in quotes, but for the most part a non-spam page gets pinned in the last 100 results with other penalized pages.


 1:10 am on Jan 16, 2007 (gmt 0)

The penalty is plain enough

It sure is. I spent some time today looking through positions 500 through 1,000 for keywords where I'm familiar with the usual players. I found a few old-time, usually well-ranking sites that I was very surprised to see hanging around that neighborhood.

It's a narrow band of sites being hit with this thing, but it's wider than you might initially think before doing some searching. I'm kind of surprised there's not more posting regarding it, but some of the usually well-ranking sites that have tanked are not aggressively SEO'd, so maybe the owners are a bit oblivious just yet.

But we're back to the same questions: why the sudden increase in the occurrence of these penalties, and what's causing them?


 1:29 am on Jan 16, 2007 (gmt 0)

It's a narrow band of sites being hit with this thing, but it's wider than you might initially think before doing some searching. I'm kind of surprised there's not more posting regarding it, but some of the usually well-ranking sites that have tanked are not aggressively SEO'd, so maybe the owners are a bit oblivious just yet.

But we're back to the same questions: why the sudden increase in the occurrence of these penalties, and what's causing them?

Well stated, randle.

Only because we are busy watching all this do we recognise it; people doing the site: command think: OK, nothing wrong, I am still in....

And a sudden increase as well...


 1:45 am on Jan 16, 2007 (gmt 0)

"why the sudden increase in the occurrence of these penalties"

This may be true, but I wouldn't assume it.

Instead, I think it is more like the hundredth monkey, or algae doubling each day as it grows across a pond... by the time you even notice it at all, the pond is three days from being completely covered.


 8:36 am on Jan 16, 2007 (gmt 0)

[quote]Kind of surprised there's not more posting regarding it[/quote]

With the number of fluxes, refreshes, and every other name applied to knob-turning at Google, I'm not surprised at all. Many webmasters may not even know the issue is there, especially if it doesn't hit their biggest keywords...

Suggestions and opinions:

1. Overuse of text or on-page SEO?
2. Overuse of the same keyword, as applied within an LSI frame?
3. Inbound links using that keyword
4. Bug
5. Other


 8:48 am on Jan 16, 2007 (gmt 0)

As noted by Moncao in msg #3218857 and jtoddv in msg #3220138 of this thread, the penalty points to over-optimization, especially keyword saturation. I also thought this might be why my site disappeared from the SERPs for several search terms, so I took action.

I used one of the keyword density tools available through a search on our friend Google and verified those same keywords/keyphrases. I then reduced keyword density by 20 to 30% for those search terms, and voila: back to normal rankings in the SERPs across all data centers. It took a little over 24 hours.


 8:52 am on Jan 16, 2007 (gmt 0)

We got hit with this 3 days ago.

I think our example will act as a 'control' in this experiment.

The site has been static in terms of on-site optimization for nearly 2 years, with steady SERP growth as offsite optimization continued, reaching #1 for its chosen keywords. We had a dual strategy for keywords:

'Yellow Widget Green Knobs' on each major section

About 6 weeks ago, my business partner aggressively stepped up offsite anchor-text link building for 'Yellow Widget'. Those pages are now at #950. 'Green Knobs' still ranks #1 for some terms, but there has been a general loss: further down page 1, and often further.

Looks like some kind of 'over-optimization' penalty; maybe too-fast link growth or too-narrow link texts.

1) Does anyone have similar experience?
2) I'm tempted to ask link partners to remove links. If someone has had the same experience, how long does it take to come out of the sin-bin?

Thank you



 12:39 pm on Jan 16, 2007 (gmt 0)

I'm back to the bottom of the results with most of the keywords/phrases I monitor. The better rankings lasted about 48 hours. I'm going to reduce the keyword/keyphrase density on a couple of pages and see if that makes any difference.

When reading the page, it reads normally, so it doesn't come across as being overoptimized. Once again, I'm building my site for Google instead of my visitors.


 4:02 pm on Jan 16, 2007 (gmt 0)

I have a feeling Google is targeting high-value business-related terms with this filter. Since we cannot list URLs here, I think it would be interesting for people to list some terms they aggressively compete for, or at the very least acknowledge that they have been hit with the filter AND they compete for high-demand/high-value business terms, not niche, non-profit, obscure, or region-specific terms.

It's definitely not a keyword density thing, because we have moderate density and we have been hit. We have a lot of rich and original content, as we are a trusted free information resource, but we balance that with affiliate revenue, and we do have a lot of outbound affiliate links (albeit with _blank, nofollow, contained in a JavaScript popup window), and we are competing for high-value business terms.


 4:14 pm on Jan 16, 2007 (gmt 0)

A few more observations:

I don't think overoptimization is it. The site I have is underoptimized if anything. It was built as I said earlier with a WYSIWYG editor in 2000. No H1, H2 tags, nothing fancy in the way of optimization. I know it needs all of these things, but I'm afraid to do too much until I know what the problem is.

And right now, I'm not sure what it is. So, I'm cleaning up code that I know will have to be cleaned up, but not adding H1 tags, CSS, etc., until I have "fixed" the site.

This site is also not in a competitive area at all, it's definitely a niche site, so it would not be targeted because of being in a highly competitive area.

I can cut down on some keywords in the body of the pages, but other pages that are also hit with the "950 Penalty" do not have large amounts of keywords in the content, so I'm not sure what would be impacting them. There are still some supplemental pages as well, due, I thought, to missing meta descriptions. But the new cache shows updated and unique metas, yet the pages are still listed as supplemental.

Is there a delay between fixing meta tags and the page leaving the supplemental index? I was under the impression it happened the next time the page was spidered, which certainly is not the case in this instance.


 6:06 pm on Jan 16, 2007 (gmt 0)

I used one of the keyword density tools available through a search at our friend Google, and verified those same keywords/keyphrases. I then reduced keyword density by 20 to 30% for those search terms, and voila - back to normal ranking in SERP across all data centers. Took a little over 24 hours.

That's not a very controlled experiment. Was the cache date on the page with the reduced keywords new in Google? If not, then Google was most likely still using your older page anyway. It is doubtful that the effect could be fixed in 24 hours by reducing on-page text and SEO. Anyone else?

WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved