Forum Moderators: Robert Charlton & goodroi
I did all those types of navigation link changes, but it didn't improve anything for my site. I've waited a few weeks for Google to crawl it completely, and the cache today shows about 99% of pages cached with no keyword repetition in successive navigation links.
I'd like to know whether a few "bad" apples (pages) can keep an entire site 950d.
I've removed all footers, and no progress, either. I'm thinking the only thing left to remove are headers.
The old navigation structure of my site has headers which are one-word keywords. There are about nine in all.
e.g:
Keyword1 ¦ Keyword2 ¦ Keyword3 ¦ . . . ¦ Keyword9
But for each of the directories, i.e:
http://www.example.com/keyword1/
there is still repetition of the horizontal header nav link in the vertical menu:
e.g:
Keyword1 ¦ Keyword2 ¦ Keyword3 ¦ . . . ¦ Keyword9
Keyword1 Widgets
Red
White
Blue
...
I had thought, or at least hoped, that having the same link with the same anchor text on the same page wouldn't get penalized. But the 950 is so a-retentive, it could say, "It's SPAM!"
Obviously it's going to look silly if I remove the header link that is used in the vertical menu, so do I have to remove the vertical menu link instead?!
That's just bad site structuring.
I HATE THIS 950 POS!
I know that many people have got the 950 lifted by doing what you said, removing the spammy links, but in early discussion about the 950, there was talk about phrases.
"Google's 950 Penalty (part 4) - or is it Phrase Based Re-ranking?"
[webmasterworld.com...]
"I'd say the new Spam detection patent is a good candidate. It certainly does re-sort the results and it also has the potential to generate drastic changes for just some search phrases and not others."--Tedster
"You can get your page included in the spam table by having too many occurrences of semantically related phrases. This certainly suggests some modifications for both body copy and anchor text that might be worth an experiment. The threshold for "too many" can also be fine-tuned in this algorithm, which would create the "popping in and out of the penalty" quality that some people report."--Tedster
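Tedster's description can be made concrete with a toy sketch. To be clear, everything below is assumed for illustration only - the phrase set, the threshold, and the counting method are invented, not anything disclosed by Google:

```python
# Toy illustration of phrase-based spam flagging (assumed mechanics).
# A page gets flagged when it contains "too many" occurrences of
# phrases from a semantically related set; the threshold is tunable.

RELATED_PHRASES = {"red widgets", "blue widgets", "cheap widgets",
                   "widget store", "buy widgets"}  # hypothetical list

def related_phrase_count(text: str, phrases=RELATED_PHRASES) -> int:
    """Total occurrences of any related phrase in the page text."""
    text = text.lower()
    return sum(text.count(p) for p in phrases)

def is_flagged(text: str, threshold: int = 25) -> bool:
    """Flag the page when related-phrase occurrences exceed the threshold."""
    return related_phrase_count(text) > threshold

page = "Buy widgets here. We sell red widgets and blue widgets. " * 10
print(related_phrase_count(page))      # 30
print(is_flagged(page, threshold=25))  # True
```

The tunable threshold is what would produce the "popping in and out of the penalty" behavior Tedster mentions: nudge it down and previously clean pages suddenly get flagged, nudge it up and they come back.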
So it's possible multiple things could assassinate your site. And #*$! are you supposed to do if the way you write naturally triggers the 950? Re-write every frickin' page? Get somebody else to?! Look at every competitive phrase on every page and remove/change it? My site has over 1,000 pages. I could fiddle around with the text on all 1,000 pages and still be 950d. At least with anchor text you can search and replace an entire site fairly quickly.
Re: too many occurrences of semantically related phrases. This certainly suggests some modifications for both body copy...
Just one other comment. Google, to apply its phrase-based spam detection filter, needs a list of semantically related phrases. It has one, obviously, or the filter would be useless. But that list of phrases is a secret, right?
Well, not exactly. It's only partially hidden. Google reveals it at the bottom of SERPs, those sets of 'suggested' related searches. I confess when they first came out, I targeted them.
That could have resulted in Google interpreting my site to have, as Tedster put it, "too many occurrences of semantically related phrases."
I didn't go crazy by targeting 1,000 different popular search phrases, but if the threshold was quite low (under 25), it could have poisoned my site.
Has anyone here not got the 950 lifted by anchor text changes, but only by phrase changes?
p/g
"The good thing about being at the bottom of SERPs is you don't worry about being overtaken."--p/g
[edited by: tedster at 9:06 pm (utc) on Feb. 27, 2008]
Domain: www.example.com / PR5 / Lots of authority links from sites like nytimes, cnn, cnet, etc.
The page that I expect to rank is the main index (www.example.com/). When doing a Google search for "example", I am nowhere to be found (not listed at all in the first 1000 results, even with omitted results included). A search for "example 2008" and the site is the first result. Also, other pages on the domain rank as expected; just the main index page is not showing up in the results for the domain name (which is also the keyword). A search for "example.com" brings the domain up as the #1 result, as expected.
The content of the site has not changed since January of 2008 (it's a static/seasonal site). Does this sound like a 950 penalty or is it something else? The site got hit last year around this time, except a search for "example" made the domain show up as the #10 result instead of #1, and all of the other pages in the domain were way back in the SERPs. This is basically the opposite (except the main index isn't found anywhere), so I am just wondering if it's the same penalty or what.
Edit: Also, a search for a unique sentence on the main page (which does not contain the keyword at all) does not return the main page (www.example.com/).
[0223] If the document is included in the SPAM_TABLE, then the document's relevance score is down weighted by predetermined factor... Alternatively, the document can simply be removed from the result set entirely. [url=http://appft1.uspto.gov/netacgi/nph-Parser?Sect1=PTO2&Sect2=HITOFF&u=%2Fnetahtml%2FPTO%2Fsearch-adv.html&r=1&p=1&f=G&l=50&d=PG01&S1=20060294155.PGNR.&OS=dn/20060294155&RS=DN/20060294155]Formal Patent on the USPTO website[/url]
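As a rough illustration of what paragraph [0223] describes, here is a sketch of that re-ranking step in Python. The spam-table entries, the relevance scores, and the down-weight factor are all made up for illustration; the patent does not disclose real values:

```python
# Sketch of the patent's [0223] re-ranking step (all numbers assumed).
# Documents in the spam table get their relevance score down-weighted
# by a predetermined factor; alternatively they can be dropped entirely.

SPAM_TABLE = {"example.com/keyword1/"}  # hypothetical flagged URLs
DOWNWEIGHT_FACTOR = 0.05                # the "predetermined factor"

def rerank(results, spam_table=SPAM_TABLE, factor=DOWNWEIGHT_FACTOR,
           drop=False):
    """results: list of (url, relevance_score) pairs."""
    reranked = []
    for url, score in results:
        if url in spam_table:
            if drop:          # the "removed from the result set" alternative
                continue
            score *= factor   # pushed toward the end of the SERPs: the "-950"
        reranked.append((url, score))
    return sorted(reranked, key=lambda r: r[1], reverse=True)

results = [("example.com/keyword1/", 0.9), ("other.com/", 0.5)]
print(rerank(results))            # the flagged page now ranks below other.com
print(rerank(results, drop=True)) # flagged page removed entirely
```

A page that would have ranked #1 on raw relevance ends up near the bottom of the result set, which matches the "not gone, just buried around position 950" behavior people report.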
On December 22 my Google traffic was reduced by 80%, and I began trying everything possible, including many suggestions found on this forum, to right the situation. On May 16 I gave up and returned the site to its pre-December 22 state.
Two hours ago my Google traffic quintupled VERY ABRUPTLY and my traffic has now returned to pre-Dec. 22 levels.
Go figure.
Edit: corrected spelling error
[edited by: OnlyToday at 7:18 pm (utc) on May 28, 2008]
I've broken free of the 950 Penalty today on my main site for the first time since last year! Hopefully the new SERPs will hold--I realize the 950 beast has been good for other sites for a short time and then slipped.
The good news is I'm seeing a return to essentially the same SERPs the site had before the 950 Penalty across several different search phrases, including a prime target.
Late this afternoon I saw traffic surging and it's continued through the rest of the day. Google visitors just doubled.
I'm cautiously optimistic, but it's the first breakthrough of any kind for months, and it came after "deoptimization." I deoptimized in various ways as already noted in this thread (sometimes one round of changes per month).
The latest round was to remove duplicate anchor text and add on-page text to pages which had a very high link:text ratio. Too many links + not enough text = a problem.
In some instances I converted anchor text into plain text. In other instances I replaced anchor text with thumbnails completely free of alt attributes, so the links are "ultraconservative," i.e., zero effort to get ranking points from anchor text.
I set up shortcuts in Dreamweaver so I could go through many pages quickly and strip the link off text in one step per link (Command-R). So it didn't take too much time actually to fix the entire site.
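For anyone wanting to audit their own pages for the "too many links + not enough text" pattern described above, here is a rough sketch using only Python's standard library. The 0.5 cutoff is an invented number, not a known Google threshold:

```python
# Rough audit for link-heavy pages: compare the amount of anchor text
# to the total visible text. The 0.5 flag threshold is invented.
from html.parser import HTMLParser

class LinkTextRatio(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_link = 0      # depth of open <a> tags
        self.link_chars = 0   # characters inside anchor text
        self.total_chars = 0  # characters of all visible text

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_link += 1

    def handle_endtag(self, tag):
        if tag == "a" and self.in_link:
            self.in_link -= 1

    def handle_data(self, data):
        n = len(data.strip())
        self.total_chars += n
        if self.in_link:
            self.link_chars += n

def link_text_ratio(html: str) -> float:
    p = LinkTextRatio()
    p.feed(html)
    return p.link_chars / p.total_chars if p.total_chars else 0.0

page = ('<a href="/red/">Red widgets</a> '
        '<a href="/blue/">Blue widgets</a> A short note.')
ratio = link_text_ratio(page)
print(round(ratio, 2))  # 0.64 - most of the visible text is anchor text
print(ratio > 0.5)      # True: candidate for adding copy or trimming links
```

Pages scoring high on a check like this are the ones where converting anchor text to plain text, or adding body copy, would move the ratio the most.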
Fingers crossed,
p/g
This is exactly what I did one year ago and broke free of the filter altogether, and never looked back.
Not to say that completely innocent websites cannot get caught up in this, but in general, I have found that most websites can be diagnosed and freed from the grip of this when elements pertaining to any given filtered page are addressed.
For what it is worth, once my pages broke free and were ranking again, I slowly built VERY high quality links to those pages, within a few weeks.
Google used to like very short pages, but with the 950 penalty it seems to have revised its position, at least in certain circumstances (e.g., thin pages AND many/mostly links).
So I'd suggest others consider whether their short pages should be deleted or their content merged into other pages. Do your short pages really need to be separate from other pages?
p/g
In terms of innocence: sometimes what we do naturally turns out to be optimization. I'd added links from each page in a category section to all of the others on the topic. I was trying to improve time on site, and SEO didn't even occur to me. But I think the -950 filter looked at the repeated links as over-optimization.
Elsewhere I had a separate index for each topic, so it was like a directory. There was no need to have much text beside each link to the pages in the directory, so it was 99% anchor text.
So I just moved those links on pages with no content into the pages with content. I put them in the right-hand column. It accomplishes the same purpose and doesn't appear to upset Google. It may look neater and more organized to have a simple directory page with no content but it's really not necessary. Actually I think it's common practice to use the LHS for sitewide links and the RHS for directorywide links.
I'm sure you could have directory link pages as long as they have content. In fact, for some directory indices, I'm keeping the intro page but adding content. They have PR3 on average and they're fairly old, so I don't really want to delete them entirely.
My site still hasn't recovered fully for the most competitive keywords/phrases. In most cases, in fact, the most competitive phrases still have a long way to go. I'm doing really well with long-tail phrases again, though. Which makes you wonder how much the 950 penalty is based on competitive phrase targeting/spam.
Also, how long after changes have people seen movement in their sites? I cannot remember how long it took last time.
It may take a cache update. Or perhaps less. It can depend on how much PR you have or how often Google respiders your page/site. Usually less than four weeks, it seems. It may now be a bit slower, though, because some webmasters report Google spiders less often lately.
Say you have a site with 1000 pages and many of them need to be fixed to get out from under the 950 penalty. Google could take its time and finish respidering them after 2 or 3 weeks. If there were that kind of penalty on a site, better SERPs wouldn't show until it's mostly/fully respidered.
Some sites, though, may only have a small number of offending pages, in which case a few days would restore SERPs. Some webmasters here got back to normal after only a very short time (days, not weeks).
p/g
The section I added has a somewhat funny structure: I would have an item-type page (i.e. Widget), and that page would link to the various individual items under it (i.e. red square widget, red round widget, red triangle widget). The problem was, the red widget came in 5 different shapes, and because I wanted to concentrate on writing a description and getting pictures for a single page vs. 5 pages with one shape each but the same product, I listed all 5 items on the "Red widget" page. Each has a picture and a text link under it, but all 5 linked to the same page. (There are many other items on the page, but often 2-3 different pictures and anchor texts linked to the same page as well.)
I added the new section about 2 weeks ago, and got hit with the 950 today.
This is likely where my problem is from reading the stuff in here. But, I am not about to make a page for every color of an item, because then when people link to me, they could be linking to one of 5 pages, instead of just that one page, which would be better.
What I just did now is that I still put in multiple links to the same page, but after the first link, I nofollowed the rest. So in the example above, the first link to the red widget is a regular link, but links 2-5 to the red widget page have nofollow.
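That "first link normal, repeats nofollowed" approach can be sketched as a simple generator over (href, anchor) pairs. This is only an illustration of the poster's experiment - whether nofollow actually helps with the -950 is exactly what is being tested, not an established fix:

```python
# Emit product links, nofollowing every repeat link to the same URL.
# Only an illustration of the experiment described in the thread.

def render_links(links):
    """links: iterable of (href, anchor_text) pairs, in page order."""
    seen = set()
    out = []
    for href, anchor in links:
        if href in seen:  # a repeat link to an already-linked page
            out.append(f'<a href="{href}" rel="nofollow">{anchor}</a>')
        else:
            seen.add(href)
            out.append(f'<a href="{href}">{anchor}</a>')
    return out

links = [("/red-widget/", "Red square widget"),
         ("/red-widget/", "Red round widget"),
         ("/red-widget/", "Red triangle widget"),
         ("/blue-widget/", "Blue widget")]
for tag in render_links(links):
    print(tag)
# Only the first /red-widget/ link passes anchor text normally.
```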
I will report back to let you know if this works.
I'm convinced that the -950 is related to the phrase-based spam detection patent. Note that according to the patent, the penalized page may either go way down in ranking, or vanish altogether.
What are the top scenarios that you see this being applied to from a practical viewpoint?
Internal links?
Inbound (IBL) links?
Overuse of keywords?
Do you think Google applies this by degrees, i.e. -30, -60, the knockout -950, and so on?
If you know a bit about SEO, too big a menu means you can easily cram in too many keyword variations - you've got a lot of anchor text to fill in on every page. The fix I see is to re-think your information architecture and create more concise navigation, rather than long laundry lists with anchor text that repeats and repeats and repeats important keywords. Concise menus are also much more user friendly.
In earlier years, working with search engines was like working with someone who was hard of hearing. You had to raise your voice and repeat yourself over and over to make your point - "THIS IS WHAT MY WEBSITE IS ABOUT." Those same habits today will get you smacked, where they used to get rewarded!
I'd say the -30, -60, or whatever number you see, are a different penalty mechanism. It's much more related to trust, backlinks, and guidelines violations, rather than "over optimization" on the page.
I've been -950 free for some time, and it did seem to be related to anchor text - both the amount and specific flagged words.
It does help to read those phrase based patents no matter how torturous. It helped me pinpoint some problems.
This could be in part at least the result of developing content slowly and naturally on the thinnest pages. Some of my best ranking pages ever were developed and interlinked slowly. Most of my sites that don't do so well were made in rushed fashion. I don't feel one can cut corners in the SE game so much nowadays.
p/g
About 7 months ago or so, I did some testing on a website of mine (that was ranking fine, top 10 in Google for its primary two-phrase keyword). The website had medium competition but was not set up to monetize.
What I did was I excessively linked between pages from content only. In this case, each page was 500 words of unique content. The menu was in fact very small, and not keyword rich by any means.
I used full and exact keywords to link between the pages of the website (a total of about 60 pages). I linked from many of the pages to the root, with keyword, and linked about 15 times between each article.
I never tripped any filters at all, and after losing some positions (3-5) for a week or two, all keywords moved up in rankings significantly, even with excessive content links (in all search engines).
Although this does not tell us the exact nature of the penalty, it certainly may tell us a lot about what the penalty is NOT about.
In fact, my next test will be all about using the same website to "overbloat" the navigation with excessive closely related phrases (red widgets, blue widgets, fuzzy widgets) to test the theory.
I am almost to the point of removing the keyword from the links - although this to me would make it harder for users to understand the nav (which is 18 links); there are 40 pages on the site in total (all content but 2 forms).
I am almost to the point of removing the keyword from the links
I had to do that for keywords that seemed to be triggering the filter. I put in synonyms but they were awkward. Several months later I was able to return the original keywords without losing the pages again. So the algo must have changed or maybe my overall site changed in a way that helped.
In another case just one good link to the page cured the problem. You just have to try a scattering of things.
I'm convinced that the -950 is related to the phrase-based spam detection patent. Note that according to the patent, the penalized page may either go way down in ranking, or vanish altogether.
Could this also be applied to the meta titles and descriptions?
Could it be part of the duplicate content filtering which causes folks' sites to disappear?
I'm just thinking that maybe Google could be applying this by degrees according to the percentage of duplicate content they see. Hence the different mechanisms you refer to (aka -40, -60, etc.). I'm sure there are other factors, but I thought this could be one.
What do you think?
arubicus - 9:13 pm on Feb 27, 2008: Enough has been said re the meta description tag, where everyone knows to make it unique. I often wrote the tag to be about 240 characters long instead of 152, but wrote the first 152 unique for the snippet in the search engines. I then wrote more unique text after that, hoping Google would see it (but would obviously not show it).
I'm seeing this type of thing repeated in several posts here where successful release from the filter has been achieved.
Maybe g1smd will have an opinion as the uncoverer of the duplicate content filter a while back.
[edited by: Whitey at 10:37 am (utc) on June 26, 2008]
My logic for this: removing just one instance of a keyword-rich link repeated in the nav of all pages has de-optimized all pages enough, thus resulting in a reduction of the phrase-based link-spam detection filter.
I don't expect the target page to return for at least a week, as other de-optimizations took place, including a title change.
This site is new and clean, so it should be a good test.
My logic for this: removing just one instance of a keyword-rich link repeated in the nav of all pages has de-optimized all pages enough, thus resulting in a reduction of the phrase-based link-spam detection filter.
Are your meta titles and descriptions completely different, or do you have a lot of similar characters in the title and description snippet?
Were the navigation links that you removed pointing to the same terms in the meta title?
And was this then repeated throughout the site as a larger proportion of the characters in the meta title?
I'm still thinking duplicate content.
# The nav links do match some of the words of the meta title of the destination page -
eg "large red widgets link" goes to page:-
page title: information about large red widgets from widgets r us
# There are no repeated blocks of text within the site (except the nav), although the wording is all based on a specific theme, so you would find a certain number of words - for example "large" and "widgets", as this is the theme of the website - in most titles, e.g. how to make extra large dark red widgets.
(Although again, all the content is hand-written and specific to this element of the site; the smallest page is 600 words of unique text.)