Forum Moderators: Robert Charlton & goodroi
I did all those types of navigation link changes, but they didn't improve anything for my site. I've waited a few weeks for Google to crawl it completely, and the cache today shows about 99% of pages cached with no keyword repetition in successive navigation links.
I'd like to know if a few "bad" apples (pages) can keep an entire site 950d.
I've removed all footers, and no progress, either. I'm thinking the only thing left to remove are headers.
The old navigation structure of my site has headers which are one-word keywords. There are about nine in all.
e.g:
Keyword1 ¦ Keyword2 ¦ Keyword3 ¦ . . . ¦ Keyword9
But for each of the directories, i.e:
http://www.example.com/keyword1/
there is still repetition of the horizontal header nav link in the vertical menu:
e.g:
Keyword1 ¦ Keyword2 ¦ Keyword3 ¦ . . . ¦ Keyword9
Keyword1 Widgets
Red
White
Blue
...
I had thought, or at least hoped, that having the same link with the same anchor text on the same page wouldn't get penalized. But the 950 is so a-retentive, it could say, "It's SPAM!"
Obviously it's going to look silly if I remove the header link that is used in the vertical menu, so do I have to remove the vertical menu link instead?!
That's just bad site structuring.
I HATE THIS 950 POS!
I know that many people have got the 950 lifted by doing what you said, removing the spammy links, but in early discussions about the 950 there was talk about phrases.
"Google's 950 Penalty (part 4) - or is it Phrase Based Re-ranking?"
[webmasterworld.com...]
"I'd say the new Spam detection patent is a good candidate. It certainly does re-sort the results and it also has the potential to generate drastic changes for just some search phrases and not others."--Tedster
"You can get your page included in the spam table by having too many occurrences of semantically related phrases. This certainly suggests some modifications for both body copy and anchor text that might be worth an experiment. The threshold for "too many" can also be fine-tuned in this algorithm, which would create the "popping in and out of the penalty" quality that some people report."--Tedster
So it's possible multiple things could assassinate your site. And #*$! are you supposed to do if the way you write naturally triggers the 950? Rewrite every frickin' page? Get somebody else to?! Look at every competitive phrase on every page and remove or change it? My site has over 1,000 pages. I could fiddle around with the text on all 1,000 pages and still be 950d. At least with anchor text you can search and replace an entire site fairly quickly.
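For what it's worth, the site-wide anchor-text search-and-replace mentioned above is easy to script. Here's a minimal Python sketch, assuming a directory of static .html files; the function name and the idea of replacing only the anchor text (not the href) are my own illustration, not anything Google or this thread prescribes:

```python
import os
import re

def replace_anchor_text(root_dir, old_anchor, new_anchor):
    """Replace anchor text (not hrefs) in every .html file under root_dir.

    Returns the number of files changed. A crude regex approach; fine for
    simple static pages, not a substitute for a real HTML parser.
    """
    # Match only the text between <a ...> and </a>, leaving the href alone.
    pattern = re.compile(
        r'(<a\b[^>]*>)\s*' + re.escape(old_anchor) + r'\s*(</a>)',
        re.IGNORECASE,
    )
    changed = 0
    for dirpath, _, filenames in os.walk(root_dir):
        for name in filenames:
            if not name.endswith('.html'):
                continue
            path = os.path.join(dirpath, name)
            with open(path, encoding='utf-8') as f:
                html = f.read()
            new_html, count = pattern.subn(r'\1' + new_anchor + r'\2', html)
            if count:
                with open(path, 'w', encoding='utf-8') as f:
                    f.write(new_html)
                changed += 1
    return changed
```

Run it once per over-used anchor phrase, then let Google recrawl, as the posts above suggest.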
Re: too many occurrences of semantically related phrases. This certainly suggests some modifications for both body copy...
Just one other comment. Google, to apply its phrase-based spam detection filter, needs a list of semantically related phrases. It has one, obviously, or the filter would be useless. But that list of phrases is a secret, right?
Well, not exactly. It's only partially hidden. Google reveals it at the bottom of SERPs, in those sets of 'suggested' related searches. I confess that when they first came out, I targeted them.
That could have resulted in Google interpreting my site to have, as Tedster put it, "too many occurrences of semantically related phrases."
I didn't go crazy by targeting 1,000 different popular search phrases, but if the threshold was quite low (under 25), it could have poisoned my site.
Has anyone here got the 950 lifted not by anchor text changes, but only by phrase changes?
p/g
"The good thing about being at the bottom of SERPs is you don't worry about being overtaken."--p/g
[edited by: tedster at 9:06 pm (utc) on Feb. 27, 2008]
This first implies something more than one word on one page, and that may be all it takes to get one page 950ed.
Also, more importantly, there is no evidence to suggest that a 950 penalty cannot occur from merely off-site factors... for example, unnatural variance of external link anchor text to a page. So it is not necessarily that Google saw something about a site; it is that Google saw something in its ranking algorithm about a page. That "something" could be off-domain, and might even be something a webmaster is completely unaware of.
I could not have said it any clearer. "Site" is used as a generalization. If it is one page that is being penalized, that one page is part of your site... what you are thinking of is "site wide," which I clearly did NOT say. "Something about your site" is as broad as it gets and can also include offsite factors.
There is no evidence of anything; Google is not easy to figure out no matter how much research and trial and error you do. The key is to keep an open mind and think in these terms: "If I were Google, what wouldn't I like about my site?"
Any penalty that Google hands out is a sign that there's "something" about the site or page they didn't like. That's essentially a tautology. The -950 is a particular species of penalty, and we do know a bit more detail about it.
1. It is an algorithmic penalty aimed at over-optimization. This is one of the few definitive things that Google reps have offered on the topic.
2. It's based on semantic calculations around keyword phrases.
3. Here's where the "something" vagueness problem comes into play. Many areas of keyword usage are folded into the algo. According to the patents, this includes at a minimum: too many related phrases, or not enough. Also, both grammatical and html formatting markers - capitalization, bold, italic, underline, anchor text, quotation marks, heading tags, in the url, position in the page template (menu, content, footer) ... on and on and on the list goes.
4. The thresholds for catching this penalty seem to be recalculated periodically, but not continually. So continual tweaking is probably not a good idea - you can run afoul of another part of the total algo if you do that.
I'd suggest making your "best guess" de-optimization all at once, and then wait. The next time there's a threshold recalculation, that may set you free. If you know that you went way over the top, do a lot. If it seems your page or site is probably just an edge case, then de-optimize a little.
5. According to the patent at least, "spam tables" are created with relationship to the particular phrase area - and not absolutely for all sites and all queries. So even if your entire site seems to be penalized, it may still rank on queries that are not the kind of targeted search that you care about. You can sometimes see instances of this by mining your server logs.
6. The thresholds are calculated in a way that accommodates the "usual practices" in a specific market - the general "semantic neighborhood", or taxonomy, if you will. So just because you hear about a similar practice to yours in a different market niche, that doesn't mean Google will tolerate that level of aggressive optimization in your market.
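To make point 3 concrete, here's a toy sketch of the kind of per-context counting such an algorithm might do. The contexts, regexes and threshold below are purely illustrative assumptions on my part - nobody outside Google knows the real signal list or cutoffs:

```python
import re
from collections import Counter

# Illustrative formatting contexts only -- the real signal list and
# thresholds inside Google's algorithm are unknown.
CONTEXTS = {
    'bold': r'<b>(.*?)</b>',
    'italic': r'<i>(.*?)</i>',
    'heading': r'<h[1-6][^>]*>(.*?)</h[1-6]>',
    'anchor': r'<a\b[^>]*>(.*?)</a>',
}

def phrase_context_counts(html, phrase):
    """Count how often a phrase appears inside each formatting context."""
    counts = Counter()
    for name, pat in CONTEXTS.items():
        for text in re.findall(pat, html, re.IGNORECASE | re.DOTALL):
            if phrase.lower() in text.lower():
                counts[name] += 1
    return counts

def looks_over_optimized(html, phrase, max_contexts=3):
    """Flag a page if the phrase shows up in 'too many' distinct contexts.

    max_contexts is a made-up threshold for demonstration.
    """
    return len(phrase_context_counts(html, phrase)) >= max_contexts
```

The point of the sketch is point C below in spirit: a phrase that appears bolded, in a heading, AND in anchor text on the same page trips more context counters than one used plainly in body copy.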
-----
In my experience, these are particularly dangerous ideas:
A. Using ALL the terms in the "Related Searches" section of a SERP
B. Creating an overly large number of phrases that all build off a core set of keywords. Don't take keyword research data and use ALL of it on your site.
C. Using your target phrase in every grammatical and html format possibility, especially using many of those options more than once.
D. Calculating co-occurrence data for your main keywords and using all those phrases.
-----
When it comes to off-site factors, one of the first of our members to see the -950 penalty fall away got released when a new and authentic link was pointed to the penalized page using the penalized anchor text!
In my view, it was the quality of the linking page that made the difference. So don't neglect to improve the content of the site, the substance and value that it offers, and its ability to attract new quality links.
I read up on Google's guidelines and noticed something about "keeping the total links on a page to under 100." Well, my total navigation exceeded 100 links, so what I simply did was...
Well, this penalty has been killing me for four months now, and I have complied with every guideline but the <100 links one. I had always assumed that guideline meant external links, because my site would be much less useful and less valuable without the internal navigation. So today I stripped all the nav links except the link to my site map; usefulness is also a function of "the quantity of users" anyway.
It occurs to me that now without all the internal nav links the best way to navigate my site is by using google, that has unpleasant overtones.
If there's any change I'll report it here.
potentialgeek, thank you for your replies. This <100 links thing seems like the most promising change at the moment, and I don't like to do more than one change at a time.
I wonder if this is a common misunderstanding, thinking that Google is only talking about external links? If anyone wants a real eye-opener, they should check their stats for which links on the page actually draw the clicks.
It appears you have leapt to all kinds of mistaken assumptions about the 950 penalty, and then are refusing to understand what the penalty is.
Some pages are hit by the 950 penalty for good reasons that no one disputes. Nobody thinks a hacked/hijacked page on an edu domain doesn't deserve the penalty, not even the hacker. The penalty is often applied rightly, sensibly and correctly.
At the same time, the 950 penalty has high collateral damage. Pages on domains that Google respects and that Google has no desire of penalizing end up being penalized. This is mistaken penalization.
Obviously these are two very different circumstances. The penalty is sometimes applied sensibly, and sometimes applied stupidly.
...they should check their stats for which links on the page actually draw the clicks.
I do, and I think that many of my regular visitors will be confused and put off by the loss of my internal navigation. I am sick over losing this well-thought-out system. I have built this site since 2002, and I regret having ever started it; Google has ruined this thing for me.
Yes, I know I could block Google; don't anyone be so stupid as to suggest that.
[edited by: OnlyToday at 12:36 am (utc) on April 27, 2008]
Sure, 500 keyworded anchors per page may attract a penalty, and I assume you don't mean that. But not real, genuine navs.
Who has any real evidence that number of links on a page can attract a penalty?
Since I've tried virtually everything else over the past four months and am nearing the point of throwing in the towel, trying <100 links per page makes sense. Perhaps in the coming days I will have the evidence. The penalty was large (90% of Google traffic) and abrupt when it happened, so if it is lifted I will notice. I have made no other changes in the past two weeks.
But not real genuine navs.
I'll let you know...
...more than 100 links on a page...
OnlyToday - Your comments about needing more than 100 links on your home page for navigation suggest to me that you should think more about hierarchical navigation structure [google.com].
At the risk of throwing this thread about the minus 950 penalty off topic, here are several threads that may get you thinking in the right direction...
Nofollow for a page with too many internal links?
[webmasterworld.com...]
I'm on the side of a good hierarchical structure, with category pages that might attract some inbound links.
linking structure within a site
unconventional methods - do they work?
[webmasterworld.com...]
Website design and funneling PageRank
[webmasterworld.com...]
If this doesn't help, I recommend starting a new thread to discuss nav structure.
</offtopic>
[edited by: Robert_Charlton at 6:00 am (utc) on April 28, 2008]
that may get you thinking in the right direction...
Thanks for this, Robert. If removing the internal links does indeed lift this penalty, I will redesign the entire site; if not, I'll redesign my life.
An interesting development after removing my internal links from the left column: I am getting fewer page views because people can't surf my site, but OTOH my click-through rate has spiked very nicely, presumably because they are exiting via the ads.
I read up on Google's guidelines and noticed something about "keeping the total links on a page to under 100." Well, my total navigation exceeded 100 links, so what I simply did was break up my navigation into 4 main categories.
I too overlooked the 100 links on a page by thinking it meant external links. Then I realized it made a lot of sense to have fewer than 100 internal links as well.
Lots of links usually means lots of anchor text - and that creates many chances to go over the phrase-spam threshold. A good information architecture with well-chosen menu labels is most often much better for users as well.
It would be incredible to me if Google penalised you for having more than 100 links on a page. It makes no sense. I do not believe this happens, because if it did Google may as well become website designers; they would basically be saying it's our way or the highway. I thought they said to consider 100 links for a sitemap?
nearing the point of throwing in the towel trying <100 links per page makes sense
And to make sure we are talking about the same thing, here's what Google says:
...snip...
Offer a site map to your users with links that point to the important parts of your site. If the site map is larger than 100 or so links, you may want to break the site map into separate pages.
...snip...
Keep the links on a given page to a reasonable number (fewer than 100).
Although I do not think that having more than 100 (internal and/or external) links directly gives you "Google penalty points", it does make sense to me that if your targeted keywords appear several times in these links, they could add up to look like spam or over-optimization. This, together with other issues Google may dislike on any given page, may well result in the "total penalty score" reaching the magic and feared threshold for the -950.
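If you want to audit your own pages against that guideline, counting links is simple to do with Python's standard html.parser. A minimal sketch; the internal/external split by domain is my own illustrative choice, and `example.com` is a placeholder:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Count <a href> links on a page, split into internal and external."""

    def __init__(self, own_domain):
        super().__init__()
        self.own_domain = own_domain
        self.internal = 0
        self.external = 0

    def handle_starttag(self, tag, attrs):
        if tag != 'a':
            return
        href = dict(attrs).get('href')
        if href is None:
            return
        # Absolute URLs to another domain count as external;
        # everything else (relative links, own domain) as internal.
        if href.startswith('http') and self.own_domain not in href:
            self.external += 1
        else:
            self.internal += 1

def count_links(html, own_domain='example.com'):
    parser = LinkCounter(own_domain)
    parser.feed(html)
    total = parser.internal + parser.external
    return parser.internal, parser.external, total
```

Note that the guideline's "fewer than 100" is about the total, internal plus external, which is exactly the point several posters above initially missed.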
if your targeted keywords appear several times in these...
To be sure, I was also skeptical that the <100 link rule included internal links and could trigger a penalty, which is why I waited four months to try this. But it is the very last of the guidelines violated, so I'm at the end of those remedies.
As I mentioned in an earlier post, dropping the menu decreased the page views immediately, and will probably result in the pages failing to pass on PageRank internally, so the pathetic traffic will probably decrease even more for that reason. But after two weeks or so, if there is no lifting of the penalty, I will return the menu. From past experience I have found that it takes about two weeks for a majority of the changed pages to appear in Google's cache - that is the threshold I will wait for before concluding that dropping the menu did not work.
Anyway, on another topic, my secondary site that I've been building to counter its original -950 penalty continues to climb up the SERPs. Today it cracked the top 20 for the most competitive search phrase.
I'm finding it easier to build a site properly from scratch than to fix the primary -950d site. I'm torn between spending time trying to fix stuff which may not get fixed and de-950d even months later (and could still fall off the wagon again if it ever gets out), and building what I see is improving every week with steady traffic increases.
I think I'll just keep building according to Google Guidelines. There aren't any Google Get Out of 950 Jail Guidelines.
p/g
I have a 6 year old site that was hit with this penalty. I have tried everything in the last 6 months and nothing has worked to get me back. I have one main keyword phrase that has been number one through this entire thing. Another main phrase was hit by the penalty along with most of the rest of my site. I lost about 95% of my Google traffic.
Anyway, I tried to unoptimize but that didn't do it. The last thing I did was to reduce the number of links (internal/external) down from 180 to around 95 on 6 of my main pages.
This has seemed to work and is holding. I'm all for following the guidelines, but I never thought that one was very serious. Seems it is a big deal to Google.
[edited by: Robert_Charlton at 2:18 am (utc) on May 3, 2008]
[edit reason] removed specifics [/edit]
Google actually suggests it twice in the "Design and content guidelines"...
Offer a site map to your users with links that point to the important parts of your site. If the site map is larger than 100 or so links, you may want to break the site map into separate pages.......
Keep the links on a given page to a reasonable number (fewer than 100).
This is really about PageRank distribution, if you think about it.
Read my "<offtopic>" comments about 5 posts up about the kind of hierarchical nav structure you should think about.
I wanted to share my experience because I finally got some success, and you all have been so helpful with this discussion. I have a 6 year old site that was hit with this penalty. I have tried everything in the last 6 months and nothing has worked to get me back. I have one main keyword phrase that has been number one through this entire thing. Another main phrase was hit by the penalty along with most of the rest of my site. I lost about 95% of my Google traffic.
Anyway, I tried to unoptimize but that didn't do it. The last thing I did was to reduce the number of links (internal/external) down from 180 to around 95 on 6 of my main pages.
This has seemed to work and is holding. I'm all for following the guidelines, but I never thought that one was very serious. Seems it is a big deal to Google.
I just wanted to ask, so I understand.
Did these links you remove, broadly have the same anchor text?
Is it possible that had you changed the anchor text of these links, rather than removing them, the same result would have occurred?
Mark
[edited by: Robert_Charlton at 8:19 pm (utc) on May 3, 2008]
[edit reason] added quote box [/edit]
I think it is very closely related to the 950 penalty.
The grey bar phenomenon
[webmasterworld.com...]
cheers
Stgeorge
Did these links you remove, broadly have the same anchor text?
Is it possible that had you changed the anchor text of these links, rather than removing them, the same result would have occurred? Mark
No, they didn't have the same anchor text at all. Everything on my site fits the subject matter, and 95% of the links are internal links. I do not link out to bad neighborhoods. I am just convinced Google is looking for a few triggers, and too many links is one of them. I might have had 2 or 3 triggers which caused the penalty. I am back on page one now with my most important phrase too.
Let's see - I'm assuming there are now about 5,000 posts on this over the past couple of years... and no one has decided or proven anything.
The guys at the 'plex have to be rolling on the floor over this - they couldn't have dreamed up a better way to make people chase their own tails...