Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

This 199 message thread spans 7 pages: < < 199 ( 1 [2] 3 4 5 6 7 > >     
Google's 950 Penalty - Part 12

 10:24 am on Oct 31, 2007 (gmt 0)

< continued from [webmasterworld.com...] >
< related threads: -950 Quick Summary [webmasterworld.com] -- -950 Part One [webmasterworld.com] >

It makes no sense for me to worry about TBPR when I've been hit hard by a -950 penalty (look! my PR rose on almost all pages ... but as I said, who cares?).

Would all -950ers please come and join this thread so we can group possible causes.

Here are mine:

1) E-mailed the AdSense team about creating an account with the domain name
2) Too many AdSense boxes
3) Mildly over-optimized pages
4) Too-similar titles
5) Some directory links (as almost all my competitors have, though)

I'll add that no big changes were made in recent months!

Join the -950ers :-)

[edited by: tedster at 9:08 pm (utc) on Feb. 27, 2008]



 6:30 am on Nov 5, 2007 (gmt 0)

I've had three sites get out of the 950, and I have altered all my sites based on what I think worked for me. Of course, you can never be 100% sure, but these are rules I now live by:

1. Unique titles for every page. Never, never use a scheme like red widgets, blue widgets, large widgets... or (perhaps even worse) Chicago widget dealers, Dallas widget dealers, etc. In my book, these are a sure ticket to 950 hell. The more words that are repeated (widgets, widget dealers), the worse your sin.

2. Absolutely unique meta description tags for every page, with nice long phrases that use most of the (190?) character limit. It doesn't matter how much time it takes; it's one of the most important factors. I would NEVER skip including a meta description tag. (The fact that Google picks up snippets to display in the SERPs has absolutely nothing to do with this issue.) If you don't go the unique meta description route, I don't think you have a legitimate gripe about being 950'd. You'll just have to live off long-tail searches.

3. Internal linking that repeats keywords: Don't crosslink the red widgets page to the blue widgets page, and vice-versa. Send the user on the red widgets page back to a higher level index page. Place the linking to red widgets, blue widgets, etc ONLY on that page. I believe this problem is even greater with geo-based linking (Chicago widget dealers, Dallas widget dealers, etc, etc.)

4. On-page anchor links with keywords: Do you have long pages with a top menu that anchors to a number of products or items to be found on that page? If you do, never, never build those anchor links with a repetitive keyword chain like I described above.

For example, I'm sure you've seen pages where the topic might be a city, say, and the anchor links go like this: Atlanta restaurants, Atlanta theatres, Atlanta hospitals. From now on, dump the word "Atlanta". My example is overly simple. In real life, you will have to look at your work carefully for less obvious repetitions.
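Points 1, 3, and 4 above all come down to the same audit: find the words that repeat across your titles and anchor labels. A minimal sketch of that check (the 50% threshold and the sample labels are my own arbitrary choices, not anything Google has published):

```python
from collections import Counter

def repeated_tokens(labels, threshold=0.5):
    """Return words that appear in more than `threshold` of the labels --
    the kind of repetition (widgets, widget dealers) warned about above."""
    counts = Counter()
    for label in labels:
        counts.update(set(label.lower().split()))  # count each word once per label
    n = len(labels)
    return sorted(w for w, c in counts.items() if c / n > threshold)

nav = ["Atlanta restaurants", "Atlanta theatres", "Atlanta hospitals"]
print(repeated_tokens(nav))  # ['atlanta']
```

Anything the audit flags is a candidate for trimming, per point 4.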

These are not new revelations... everything has been discussed earlier in this miles-long thread. You can find great supporting threads that address specific issues like meta description tags and titles. It just takes hours and hours of looking and reading.

If anyone needs to know more about my suggestions, I'd be happy to explain further. Remember, this is just one person's experience.

Good luck.


 10:23 am on Nov 5, 2007 (gmt 0)

Thanks for the tips, dibbern2.

Once these issues were addressed, how long did it take for rankings to return to normal? Did you experience any SERP movement before fixing things, etc.?


 4:51 pm on Nov 5, 2007 (gmt 0)

Virtually no serp movement before fixing, at least in my case.

Two sites came back quickly, in a week or so. The third took several months; I think it had more problems than the first two.


 5:46 pm on Nov 5, 2007 (gmt 0)

Today, I'm back in SERPs.

Here's what I did and noticed:


1) Removed all AdSense boxes but one.
2) Cut titles down (max length is now 170 chars).
3) Asked for removal of some "strange" links that might look spammy or paid.
4) Removed all references to "link exchange" on my domain.
5) Filed a reinclusion request from WMT.
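Item 2 above is easy to script. A minimal sketch that trims a title at a word boundary; the 170-character cap is just the figure this poster chose, not a documented limit:

```python
MAX_LEN = 170  # the poster's chosen cap, not an official limit

def cut_title(title, max_len=MAX_LEN):
    """Trim a page title to at most max_len characters, cutting at a word boundary."""
    if len(title) <= max_len:
        return title
    return title[:max_len].rsplit(" ", 1)[0]

print(cut_title("blue widgets " * 20))  # trimmed to <= 170 chars, no split word
```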


I dropped out of the SERPs about a week ago, and from what I can see the key point is that TITLEs are refreshing in Google's index and are no longer all that similar (boston red widget, los angeles red widgets ...). That said, I can still see many cached pages identical to yesterday's.

I'll keep checking which action (if any) actually helped, and I'll post it here.

Good luck to all.


 6:17 pm on Nov 5, 2007 (gmt 0)

3) Asked for removal of some "strange" links that might look spammy or paid

Sites linking to you or you linking to them?

Given the number of sites rising in the SERPs on obviously paid links and counter-spam, it's hard to imagine that Google is penalizing for inbound links.


 6:55 pm on Nov 5, 2007 (gmt 0)

I dropped out of the SERPs about a week ago

So you dropped out, made some changes, and came back, all within one week? I'm sorry, but I seriously doubt any cause and effect can be determined in this case.

Many sites bounce in and out. If you had been out of the SERPs for months and then made changes that seemed to bring you back, it might mean something.


 7:12 pm on Nov 5, 2007 (gmt 0)

A reinclusion request for a -950? Strange; I don't think this will work, but of course Google could go ahead and tell webmasters whether it makes sense to do one.

Come on, Google, please give us some help.

[edited by: SEOPTI at 7:13 pm (utc) on Nov. 5, 2007]


 9:58 pm on Nov 5, 2007 (gmt 0)


Sites linking to me.


 10:02 pm on Nov 5, 2007 (gmt 0)


You may be right. I can only say what I can see. If things are as I imagine, in about 3-4 weeks I could be in the -950 box again.

Also, everything evolves continuously, so it is hard to check all possible solutions and be sure they are right.

Good luck!


 5:27 am on Nov 6, 2007 (gmt 0)

Has anyone here tested two 950'd sites, sending a reinclusion request for one but not the other? Is there any time advantage?

Do we know if reinclusion requests are manual? Someone speculated a while back that penalties are automated, whereas reinclusions are manual.

Some folks here have said they made changes and got out from under the 950 in a few days. I doubt Google staff process requests that fast; Google's own site, moreover, says to allow weeks.

So I'm guessing no reinclusion request is required; I just wonder if in some cases it could lift the 950 penalty faster. Perhaps processing is now automated, and Google simply rechecks sites that filed reinclusion requests to see if they are now in compliance?



 5:38 am on Nov 6, 2007 (gmt 0)

The penalty is a recalculation of position based on the factors most people have been discussing here.

Within reason, changing those factors will likely lift the penalty. A reinclusion request does nothing if the factors are not removed: the penalty is most likely calculated algorithmically, so unless the factors are removed you will simply end up back in the dumps.

Based on the information many people have suggested, there are some clear factors involved with this penalty and pretty clear suggestions on how you can get out of it.


 12:35 pm on Nov 6, 2007 (gmt 0)

Yeah, right. Of course I meant after fixing the problem(s).

The question is really about Google's attitude to offenders. It could say, 'They tried to trick us in an awful way. Those scheming little #$%$%! We won't bother to check if they cleaned up their act for six months.'

Or it could say, 'They were hit hard with a 950. We'll be kind and check once a week and see if they're back on track with white hat techniques.'

Or, 'They could lose their business if they don't get their SERPs back fast; we'll return every day and check out their site.'

The issue is really all about the bot. In another thread, someone said that after they got a penalty the bot stopped coming back.

It's easy to see why Google is not going to be keen to return to a site it just penalized.

You'd hope a reinclusion request would reset the bot to return.

Has anyone checked their logs and drawn any conclusions? I've read a lot of speculation on this board. How about some cold hard data?



 2:04 pm on Nov 6, 2007 (gmt 0)

The bot definitely hasn't stopped coming back, but even without doing a comprehensive log analysis I can certainly say that the crawl rate has dropped significantly.

Regarding the time to come back: in previous parts of this thread, the general consensus was that a 950'd site came back within 1-2 weeks of the major issues being addressed.


 7:03 pm on Nov 6, 2007 (gmt 0)

The cold hard data is that changes to those areas of your site will cause it to return to better positions regardless of crawl rate. My general experience with two sites is that this happens within weeks. You would know whether the changes work by watching for movement in the SERPs. I would start with the suggestions people have made in this thread: unique titles, useful navigation that is not keyword-heavy for the sake of being keyword-heavy, and unique meta descriptions.

Think of a flooded car. If you keep giving it gas, it will never start, and movement is zero.

If you simply take your foot off the gas for a bit and stop feeding it a rich mixture, the car will cease to be flooded, will start again, and movement will happen.


 8:18 pm on Nov 6, 2007 (gmt 0)

Hello All:
I must have goofed on a posting this weekend.

My site was back in the top 5 Saturday early afternoon.

What I did (and sorry, I panicked and applied every suggestion offered, or that I inferred from postings about similar situations):

1.) removed page titles that I had changed to contain keywords (the keywords for which our site lost search ranking)

2.) submitted a request for Google to review my site; not a request for re-inclusion.

3.) reduced the keyword density on the homepage that I had increased; it was well written and not obvious keyword cramming.

4.) removed recent outward links that came up funky in Xenu Link Sleuth.

5.) removed an award link that I had not updated for 2007; these are customer-voted awards that you are invited to claim and post as a picture link on your website.

We are still in the top 5 on one keyword; on the other we are sitting at 12... it used to be a bit higher.

Thank you, everyone, for your help.

I will keep watching for similar stories or correlations. The best ending would be to find the cause or causes...and the solution.


 11:20 pm on Nov 6, 2007 (gmt 0)


3) Asked for removal of some "strange" links that might look spammy or paid

Did you ask them to remove links to you... or to pages of your site?

I have a bunch of spammy scrapers with links to me; I've written them asking to stop, but good luck. Can we ask Google to disregard links to us from a specific site? That would be nice.


 12:44 am on Nov 7, 2007 (gmt 0)

I have a bunch of spammy scrapers with links to me; I've written them asking to stop, but good luck. Can we ask Google to disregard links to us from a specific site? That would be nice.

This is my point too. My worst-looking links are scraper sites, by far. Of course, there are bloggers out there linking to 10-15 pages of my site in their posts for no good reason at all, but why should I care what they are doing? It's unreasonable to expect webmasters to police each and every inbound link they get.

I'd also count my Yahoo Answers links as somewhat spammy-looking. I know they are nofollowed, but they do appear in GWT, so I get to see them.


 4:59 am on Nov 7, 2007 (gmt 0)

I doubt that inbound links from scraper sites are a factor in the -950 phenomenon. And as long as your overall backlink profile is healthy and natural for your market, I can't see those links hurting you, no matter what.

Now if your backlink profile is otherwise weak, then maybe scraper site links could have a negative impact. But that's still a pretty weak "maybe" I'm giving the idea.


 6:49 am on Nov 7, 2007 (gmt 0)

Escaped the 950 today! Yay! At least for one site section, i.e. example.com/widgets/. (I've not checked the others yet.) Not at the bottom of the pit, but not back to previous SERPs for the relevant keywords, either. That may come in time (let's hope).

No reinclusion request. (I was going to wait till entire site changes were completed before seriously thinking about that.) Saw a bump in traffic today, so checked latest SERPs, and was pleasantly surprised.

Google's cache shows the changes made to beat the 950. What changes?

1) removed all duplicate content from internal linking structure (i.e., same keywords from all pages interlinking to all other pages). This meant truncating navigation links from the full anchor text (matching page titles) to one/two/three unique words.

2) left all page titles and meta tags as they were.

3) removed all alt text from image icons (next to text links), but kept the icons' links.

4) removed some related content links (to other parts of the site) from the bottom of the index page.

5) broke up a mini "link farm" (three or four thin/underdeveloped sites which were interlinked and 950'd), incl. links to the directory index page. All footer links removed.
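The truncation in point 1, cutting each navigation label from the full page title down to its unique words, might be sketched like this (the titles are invented examples):

```python
def unique_part(title, all_titles):
    """Keep only the words of `title` that appear in no other title."""
    others = set()
    for t in all_titles:
        if t != title:
            others.update(t.lower().split())
    kept = [w for w in title.split() if w.lower() not in others]
    return " ".join(kept) or title  # fall back to the full title if nothing is unique

titles = ["Cheap Red Widgets Guide", "Cheap Blue Widgets Guide", "Cheap Green Widgets Guide"]
print([unique_part(t, titles) for t in titles])  # ['Red', 'Blue', 'Green']
```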



 9:14 am on Nov 7, 2007 (gmt 0)

1) removed all duplicate content from internal linking structure (i.e., same keywords from all pages interlinking to all other pages). This meant truncating navigation links from the full anchor text (matching page titles) to one/two/three unique words.

Would you mind explaining this a bit further? I've been stuck in this mess for a long time, and I am looking hard at this as part of it.

We are working on a redesign and have cut down our navigation keyword repetition. More natural, if you will.

So rather than red widgets, blue widgets, etc., it is red, blue, green, etc., within a section called Widgets. The red link points to a page titled blah blah red widget (or whatever). The H1 tag is different from the title, but typically (not always) the keyword combination is repeated. So the title may be something something red widget, and the h1 would be "something red widget something something."

The red page links to our articles with brief descriptions. These links naturally use the title of the article, which is the same text as the meta title and the article's h1 tag. These articles seem to have the most problems. I wouldn't think this is a problem, but who knows.

We have related articles and discussions at the bottom of our article pages. We only list a few of the newest articles out of the red (widget) section.

Our left navigation menu only lists a drill-down: Home Page -> Widgets -> Red -> (and further for deeper sections). All sections are like this; it is like a breadcrumb, just in the left menu. So far, cutting the navigation seems to have had little effect.

And that is about the extent of it. Very simple site structure. I just can't see anywhere else on the site to look. I thought maybe I'd extend the navigation to other parts of the directory, such as blue, green, etc. I know our visitors would move around the site more, but again... I don't know.

We have been around for 7 years and enjoyed great rankings. With those rankings we enjoyed some wonderful scraping, article stealing, and even whole-site duplication (took down most of the rip-offs, but still get more to deal with every week). I just don't know where else to look.

Offsite...well more links couldn't hurt. Or can they...lol.

Doing searches puts me at the end of the results pages. What is funny is that it almost always shows our own pages LINKING to the most relevant page. For a search on the exact article title, rather than the article itself, it is our directory pages, and even articles that link to it, that are displayed. Even when our listing is NOT at the end of the results. Sometimes there are two listings split apart: if I search a certain directory title, the relevant page is at the end of the results and an article that links to it is at about position 500. Does anyone know why this is? It gets real funky sometimes.


 11:24 am on Nov 7, 2007 (gmt 0)

Hi all,

As I mentioned in another thread recently, I've lost a huge amount of Google traffic (95%+) to one of my sites. How do I find out for sure whether I am 950'd? My site is listed if I do a "site:mydomain.tld".

Also, is there any point in contacting Google about this? Do I have a realistic chance of getting any kind of non-automated reply?

Thanks all


 2:27 pm on Nov 7, 2007 (gmt 0)

Cutting down internal anchor text definitely helps. I use PHP's str_replace function for this.
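As a sketch of the idea (the poster's PHP str_replace approach, shown here in Python; the keyword and nav labels are placeholders):

```python
KEYWORD = "widgets"  # the word being repeated across nav anchors (placeholder)

def trim_anchor(label, keyword=KEYWORD):
    """Strip the repeated keyword from an anchor label, like str_replace would."""
    trimmed = label.replace(keyword, "").strip()
    return trimmed or label  # never emit an empty link label

nav = ["blue widgets", "red widgets", "green widgets"]
print([trim_anchor(a) for a in nav])  # ['blue', 'red', 'green']
```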


 9:40 pm on Nov 7, 2007 (gmt 0)

Some of my sites were -31 and went to -950, many of them escaped on their own or by adding backlinks and fresh content. I have 2 sites left that seem to be stuck in some other type of penalty. No matter what I do nothing seems to work on them after a year of trying. WMT either shows no keywords or keywords with the site hovering between 30-190.

I got fed up with asking Google for a reprieve and just banned Googlebot from 95% of these sites (everything except the sales pages). I figure those PR 5 "content pages" with "Googlebot noindex" tags might toss up a flag on their end someday. Until then, no Google user should get the benefit of my content, even if it's on page 4. I can't wait till PubCon ;)


 9:43 pm on Nov 7, 2007 (gmt 0)

Cutting down internal anchor text definitely helps. I use PHP's str_replace function for this.

I decided to give this a shot today.

I've always run my navigation like:

blue widget1
red widget1
green widget1

now I'm:

blue1
red1
green1
(keeping the page titles the same, and the h1 on the page the same as before - red widget1)

Will see how that goes. My navigation will take a little getting used to, but it's easier to find what I'm looking for without the KW repetition.


 6:28 am on Nov 8, 2007 (gmt 0)


I asked them to remove links ON their site.



 3:09 pm on Nov 10, 2007 (gmt 0)

They are tweaking their -950 filters again today. It seems to me they are playing with the co-occurrence of phrases, but it could also be a relevancy tweak for incoming links. Still testing ...

[edited by: SEOPTI at 3:18 pm (utc) on Nov. 10, 2007]


 6:57 pm on Nov 10, 2007 (gmt 0)

We need to be careful not to overreact in trying to get the 950 penalty lifted. We have to find the balance between changing enough content to get it lifted, but not so much that we are unable to return to our original SERPs.

The fact that Google is so quiet (and vague when it isn't quiet: it's an over-optimization penalty, says Matt Cutts), coupled with the fact that nobody really understands this penalty very well, plus its extreme nature, makes it easy to overreact.

Surely we don't need to throw out everything we learned about optimization? Does anyone know from testing what you don't need to edit to get out from the 950 and back to original SERPs?



 7:10 pm on Nov 10, 2007 (gmt 0)

Here is what I have found:

1. You don't need to stop building links to your website, in fact I found that one or two really high quality links with the keyword phrase can often help tip the scales in the right direction.

2. You don't need to remove your left-hand nav or substitute words that do not make sense. That said, most of the -950'd sites I have seen had what could be deemed 'overuse' of left-hand navigation keywords and repetition.

3. You might want to consider not having both left hand nav and footer links that are optimized pointing in to the website, but you should not remove all links if they help visitors find what they need.

4. You don't need to remove H1 tags, simply make them read more natural and not copy the link anchor text verbatim.

5. You don't need to remove meta description tags, but each one should be unique and well written as an 'intro' to what the page is about.

6. Most of all, don't pull your hair out, and document all changes. Watch cache dates to ensure that the changes you make are reflected in the SERPs. Without documenting each change in a file with dates, you are throwing seed to the wind.
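Point 6 can be as simple as appending dated lines to a text file, so SERP movement can later be matched to specific edits. A trivial sketch (the file name and example entry are arbitrary):

```python
import datetime

def log_change(note, path="950-changelog.txt"):
    """Append a dated one-line note about a site change."""
    stamp = datetime.date.today().isoformat()
    with open(path, "a") as f:
        f.write(f"{stamp}\t{note}\n")

log_change("Rewrote meta description on /widgets/red.html")  # example entry
```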


 8:22 pm on Nov 10, 2007 (gmt 0)

Very sane advice, CainIV. There's no one action, or set of specific actions, that will always lift the -950.

I brought up a connection between the -950 and the phrase-based patents in Feb 2007 [webmasterworld.com]. Everything I've seen since then has confirmed for me that there is a mechanism in play based on the logical processes that those "phrase based indexing" patents describe - most especially the spam detection patent [webmasterworld.com].

Key points that can make this -950 such a bear to "fix":

1. The re-ranking is triggered by crossing a threshold.

2. The threshold can be different for different search terms.

3. The threshold can be different for different markets or website taxonomies.

4. The threshold is set by measuring and combining many different types of mark-up and grammatically related factors, not by absolutely measuring any one factor.

5. The threshold is NOT set absolutely across all web documents. So phrases in the travel space can be held to a different measure than, say, phrases in jewelry e-commerce.

The patents suggest scoring all kinds of areas, for example:

"[0042] ...grammatical or format markers, for example by being in boldface, or underline, or as anchor text in a hyperlink, or in quotation marks."

"[0133] ...whether the occurrence is a title, bold, a heading, in a URL, in the body, in a sidebar, in a footer, in an advertisement, capitalized, or in some other type of HTML markup." Note that measurements are suggested here for position on the page.

I doubt that anyone is participating in this thread because they are penalized for autogenerated content that mashes up Moby Dick paragraphs with scraped content. What has happened is that, somehow or other, your pages have gone beyond the spam detection threshold that is currently set, so that we might now call those pages "over-optimized". (Quite the oxymoron there!)

Going wild with a "de-optimization" effort could deflate your pages to the point where they NATURALLY should rank at 950! So use a gentle touch, record your changes - and know that if you are just barely over some threshold then it might not take much to move you back. Also the threshold will be re-calibrated from time to time and you might "pop out" without doing anything at all, or for reasons that are not related to the changes you did make.

I know that these patents are challenging to read. I also know they are rather general - they point more to a type of process rather than nailing down any specifics. Still, I've found them quite illuminating and worth the effort. They not only describe spam detection (or over-optimization detection) but they suggest a way of thinking about what Google needs and doesn't need for strong relevance signals.


 7:28 pm on Nov 11, 2007 (gmt 0)

They are tweaking their -950 filters again today. It seems to me they are playing with the co-occurrence of phrases

I'm seeing the same thing today. From the keywords hit, it appears to be the co-occurrence of phrases on a page.


 7:47 pm on Nov 11, 2007 (gmt 0)

Ted, excellent post as usual. I often think the most successful people in our field take the time to read, review and hypothesize on just these types of patents.

And really, what you are alluding to speaks to the properties of any given document and the WAY in which Google might see it.

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved