
Google's 950 Penalty - Part 13

11:01 am on Feb 9, 2008 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member

joined:Apr 26, 2006
votes: 0

< continued from [webmasterworld.com...] >
< related threads: -950 Quick Summary [webmasterworld.com] -- -950 Part One [webmasterworld.com] >

I made all those types of navigation link changes, but they didn't improve anything for my site. I've waited a few weeks for Google to crawl it completely, and the cache today shows about 99% of pages cached, with no keyword repetition in successive navigation links.

I'd like to know whether a few "bad" apples (pages) can keep an entire site 950d.

I've removed all footers too, with no progress. I'm thinking the only thing left to remove is the headers.

The old navigation structure of my site has headers which are one-word keywords. There are about nine in all.


Keyword1 Keyword2 Keyword3 . . . Keyword9

But for each of the directories, i.e.:


there is still repetition of the horizontal header nav link in the vertical menu:


Keyword1 Keyword2 Keyword3 . . . Keyword9

Keyword1 Widgets

I had thought, or at least hoped, that having the same link with the same anchor text twice on the same page wouldn't get penalized. But the 950 is so a-retentive, it could say, "It's SPAM!"

Obviously it's going to look silly if I remove the header link that is used in the vertical menu, so do I have to remove the vertical menu link instead?!

That's just bad site structuring.
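Before deciding which of the two duplicated links to drop, it helps to audit which anchor texts actually repeat on a given page. A minimal sketch in Python (my own throwaway audit script, not any tool Google or WebmasterWorld provides, and it assumes simple, well-formed `<a>` tags):

```python
import re
from collections import Counter

def duplicate_anchors(html):
    """Return anchor texts that appear in more than one link on a page --
    the kind of header/vertical-menu repetition described above."""
    texts = re.findall(r"<a\b[^>]*>(.*?)</a>", html, re.I | re.S)
    counts = Counter(t.strip().lower() for t in texts)
    return [text for text, n in counts.items() if n > 1]
```

Feed it each page's HTML and anything it returns is a candidate for the header-versus-menu decision above.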


I know that many people have gotten the 950 lifted by doing what you said and removing the spammy links, but in early discussions of the 950 there was talk about phrases.

"Google's 950 Penalty (part 4) - or is it Phrase Based Re-ranking?"

"I'd say the new Spam detection patent is a good candidate. It certainly does re-sort the results and it also has the potential to generate drastic changes for just some search phrases and not others."--Tedster

"You can get your page included in the spam table by having too many occurrences of semantically related phrases. This certainly suggests some modifications for both body copy and anchor text that might be worth an experiment. The threshold for "too many" can also be fine-tuned in this algorithm, which would create the "popping in and out of the penalty" quality that some people report."--Tedster

So it's possible multiple things could assassinate your site. And #*$! are you supposed to do if the way you write naturally triggers the 950? Re-write every frickin' page? Get somebody else to?! Look at every competitive phrase on every page and remove/change it? My site has over 1,000 pages. I could fiddle around with the text on all 1,000 pages and still be 950d. At least with anchor text you can search and replace an entire site fairly quickly.
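That site-wide anchor-text search-and-replace can be sketched in a few lines of Python. This assumes a static site of .html files and exact-match anchor text; the function name and approach are mine, just an illustration of why anchor text is quicker to fix than body copy:

```python
import re
from pathlib import Path

def replace_anchor_text(root, old_text, new_text):
    """Swap the visible text of every <a>old_text</a> link in each
    .html file under root, leaving the href alone. Returns the number
    of files changed. A rough sketch: it assumes static files and
    anchor text sitting directly inside the <a> tag."""
    pattern = re.compile(
        r"(<a\b[^>]*>)" + re.escape(old_text) + r"(</a>)", re.IGNORECASE
    )
    changed = 0
    for page in Path(root).rglob("*.html"):
        html = page.read_text(encoding="utf-8")
        new_html, hits = pattern.subn(r"\g<1>" + new_text + r"\g<2>", html)
        if hits:
            page.write_text(new_html, encoding="utf-8")
            changed += 1
    return changed
```

One pass like this over 1,000 pages takes seconds; hand-rewriting the body copy on those same pages takes weeks.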

Re: too many occurrences of semantically related phrases. This certainly suggests some modifications for both body copy...

Just one other comment. Google, to apply its phrase-based spam detection filter, needs a list of semantically related phrases. It has one, obviously, or the filter would be useless. But that list of phrases is a secret, right?

Well, not exactly. It's only partially hidden. Google reveals it at the bottom of SERPs, in those sets of 'suggested' related searches. I confess that when they first came out, I targeted them.

That could have resulted in Google interpreting my site to have, as Tedster put it, "too many occurrences of semantically related phrases."

I didn't go crazy by targeting 1,000 different popular search phrases, but if the threshold was quite low (under 25), it could have poisoned my site.
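Tedster's "spam table" description boils down to a counter with a tunable threshold. Here is a toy model of that idea; the phrase list, the naive counting, and the default threshold of 25 are all my guesses, not anything from the patent:

```python
def related_phrase_count(text, related_phrases, threshold=25):
    """Count occurrences of a set of semantically related phrases in a
    page's text, and flag the page if the total crosses a tunable
    threshold -- a caricature of the 'spam table' idea, not Google's
    actual algorithm."""
    lowered = text.lower()
    total = sum(lowered.count(phrase.lower()) for phrase in related_phrases)
    return total, total >= threshold
```

Run it against each page's extracted text using the 'suggested' related searches for your main term as the phrase list, and you at least get a consistent number to compare page against page.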

Has anyone here had the 950 lifted not by anchor text changes, but only by phrase changes?


"The good thing about being at the bottom of SERPs is you don't worry about being overtaken."--p/g

[edited by: tedster at 9:06 pm (utc) on Feb. 27, 2008]

12:52 pm on June 26, 2008 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Jan 9, 2005
votes: 0

When I look at my sector, all the big guns use internal linking with keywords excessively. They have up to 200 links per page, each of them full of keywords.
They all rank fine and seem to do better than our sites with limited internal linking.

I think the -950 penalty is about content, not links. Are you all 100% sure the content is all right?

2:37 pm on June 26, 2008 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
votes: 0

idolw, I'd say the -950 is about an entire range of factors. We were discussing what commonly trips the -950, and the mega-menu has been a common factor that I've seen - and a rethinking of the menu structure to make it shorter has often resulted in a fix.

But I never intended anyone to think this is the only way to trip a -950 or "over optimization" reranking. As I mentioned in the -950 summary thread [webmasterworld.com]:

The patents suggest scoring all kinds of areas, for example:

"[0042] ...grammatical or format markers, for example by being in boldface, or underline, or as anchor text in a hyperlink, or in quotation marks."

"[0133] ...whether the occurrence is a title, bold, a heading, in a URL, in the body, in a sidebar, in a footer, in an advertisement, capitalized, or in some other type of HTML markup." Note that measurements are suggested here for position on the page.
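Paragraph [0133] suggests each occurrence gets scored by the HTML context it sits in. One way to picture that idea; the weights below are entirely invented, since the patent names the contexts but publishes no numbers:

```python
import re

# Made-up weights for the occurrence contexts the patent excerpt
# lists (title, heading, anchor, bold, plain body text).
CONTEXT_WEIGHTS = {"title": 3.0, "h1": 2.5, "a": 2.0, "b": 1.5, "body": 1.0}

def weighted_occurrences(html, phrase):
    """Score a phrase's occurrences on a page, weighting each hit by
    its HTML context. Marked-up hits are counted twice: once at their
    context weight and once at body weight, because the body pass
    strips all tags before counting."""
    html_l, phrase_re = html.lower(), re.escape(phrase.lower())
    score = 0.0
    for tag, weight in CONTEXT_WEIGHTS.items():
        if tag == "body":
            continue
        for match in re.finditer(rf"<{tag}\b[^>]*>(.*?)</{tag}>", html_l, re.S):
            score += weight * len(re.findall(phrase_re, match.group(1)))
    text_only = re.sub(r"<[^>]+>", " ", html_l)
    score += CONTEXT_WEIGHTS["body"] * len(re.findall(phrase_re, text_only))
    return score
```

The point is only that the same phrase count can produce very different scores depending on where on the page the hits land, which matches the position-on-the-page measurements suggested above.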

If it looks like the "big guns" get special treatment, it may be that positive trust factors can give a major site a relative level of immunity from the -950. This wasn't mentioned in the phrase-based patents that I remember, but it would certainly make sense. It could even account for the lower number of false positives in recent times.

< continued here: [webmasterworld.com...] >

This 212 message thread spans 8 pages.