Google's 950 Penalty - Part 13
potentialgeek
msg:3570326 - 11:01 am on Feb 9, 2008 (gmt 0)

< continued from [webmasterworld.com...] >
< related threads: -950 Quick Summary [webmasterworld.com] -- -950 Part One [webmasterworld.com] >

I made all those types of navigation link changes, but it didn't improve anything for my site. I've waited a few weeks for Google to crawl it completely, and the cache today shows about 99% of pages cached with no keyword repetition in successive navigation links.

I'd like to know whether a few "bad" apples (pages) can keep an entire site 950'd.

I've removed all footers, with no progress either. I'm thinking the only things left to remove are the headers.

The old navigation structure of my site has headers which are one-word keywords. There are about nine in all.

e.g.:

Keyword1 Keyword2 Keyword3 . . . Keyword9

But for each of the directories, i.e.:

http://www.example.com/keyword1/

there is still repetition of the horizontal header nav link in the vertical menu:

e.g.:

Keyword1 Keyword2 Keyword3 . . . Keyword9

Keyword1 Widgets
Red
White
Blue
...
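A quick way to audit a template for this kind of repetition is to count duplicate href/anchor-text pairs per page. A minimal sketch, assuming Python with the beautifulsoup4 library installed; the URL is just a placeholder:

```python
# Minimal audit sketch: count repeated (href, anchor text) pairs on a page,
# i.e. the same link with the same anchor text appearing more than once.
# Assumes beautifulsoup4 is installed; the URL below is a placeholder.
from collections import Counter
from urllib.request import urlopen

from bs4 import BeautifulSoup

def duplicate_anchors(url):
    soup = BeautifulSoup(urlopen(url).read(), "html.parser")
    pairs = Counter(
        (a.get("href"), a.get_text(strip=True).lower())
        for a in soup.find_all("a", href=True)
    )
    return {pair: n for pair, n in pairs.items() if n > 1}

for (href, text), n in duplicate_anchors("http://www.example.com/keyword1/").items():
    print(f"{n}x {text!r} -> {href}")
```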

I had thought, or at least hoped, that having the same link with the same anchor text on the same page wouldn't get penalized. But the 950 is so a-retentive, it could say, "It's SPAM!"

Obviously it's going to look silly if I remove the header link that is used in the vertical menu, so do I have to remove the vertical menu link instead?!

That's just bad site structuring.

I HATE THIS 950 POS!

I know that many people have got the 950 lifted by doing what you said, removing the spammy links, but in early discussions about the 950 there was talk about phrases.

"Google's 950 Penalty (part 4) - or is it Phrase Based Re-ranking?"
[webmasterworld.com...]

"I'd say the new Spam detection patent is a good candidate. It certainly does re-sort the results and it also has the potential to generate drastic changes for just some search phrases and not others."--Tedster

"You can get your page included in the spam table by having too many occurances of semantically related phrases. This certainly suggests some modifications for both body copy and anchor text that might be worth an experiment. The threshold for "too many" can also be fine-tuned in this algorithm, which would create the "popping in and out of the penalty" quality that some people report."--Tedster

So it's possible multiple things could assassinate your site. And #*$! are you supposed to do if the way you write naturally triggers the 950? Re-write every frickin' page? Get somebody else to?! Look at every competitive phrase on every page and remove/change it? My site has over 1,000 pages. I could fiddle around with the text on all 1,000 pages and still be 950d. At least with anchor text you can search and replace an entire site fairly quickly.

Re: too many occurrences of semantically related phrases. This certainly suggests some modifications for both body copy...

Just one other comment. Google, to apply its phrase-based spam detection filter, needs a list of semantically related phrases. It has one, obviously, or the filter would be useless. But that list of phrases is a secret, right?

Well, not exactly. It's only partially hidden. Google reveals it at the bottom of SERPs, those sets of 'suggested' related searches. I confess when they first came out, I targeted them.

That could have resulted in Google interpreting my site to have, as Tedster put it, "too many occurrences of semantically related phrases."

I didn't go crazy by targeting 1,000 different popular search phrases, but if the threshold was quite low (under 25), it could have poisoned my site.
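To make the threshold idea concrete, here is a sketch of that logic; the phrase list and the cutoff of 25 are purely hypothetical (nobody outside Google knows the real values):

```python
# Hypothetical sketch of the phrase-counting idea: tally occurrences of
# "semantically related" phrases in a page's text and flag the page when
# the total crosses a threshold. Phrase list and cutoff are invented.
import re

RELATED_PHRASES = ["cheap red widgets", "red widget sale", "buy red widgets"]
THRESHOLD = 25  # per the "under 25" guess above -- not a known Google value

def related_phrase_count(page_text):
    text = page_text.lower()
    return sum(len(re.findall(re.escape(p), text)) for p in RELATED_PHRASES)

def over_threshold(page_text):
    return related_phrase_count(page_text) >= THRESHOLD
```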

Has anyone here not got the 950 lifted by anchor text changes, but only by phrase changes?

p/g

"The good thing about being at the bottom of SERPs is you don't worry about being overtaken."--p/g

[edited by: tedster at 9:06 pm (utc) on Feb. 27, 2008]

 

errorsamac
msg:3657331 - 12:51 pm on May 23, 2008 (gmt 0)

It's been a little while since I've posted in the 950 thread, but I just noticed one of the larger sites I run got hit, and I would like to confirm that this is a 950 issue and not a different penalty.

Domain: www.example.com / PR5 / Lots of authority links from sites like nytimes, cnn, cnet, etc.

The page that I expect to rank is the main index (www.example.com/). When doing a Google search for "example", I am nowhere to be found (not listed at all in the first 1000 results, even with omitted results included). With a search for "example 2008", the site is the first result. Also, other pages on the domain rank as expected; just the main index page is not showing up in the results for the domain name (which is also the keyword). A search for "example.com" brings up the domain as the #1 result, as expected.

The content of the site has not changed since January of 2008 (it's a static/seasonal site). Does this sound like a 950 penalty, or is it something else? The site got hit last year around this time, except a search for "example" made the domain show up as the #10 result instead of #1, and all of the other pages in the domain were way back in the SERPs. This is basically the opposite (except the main index isn't found anywhere), so I am just wondering if it's the same penalty or what.

Edit: Also, a search for a unique sentence on the main page (which does not contain the keyword at all) does not return the main page (www.example.com/).

JoeSinkwitz
msg:3657403 - 2:19 pm on May 23, 2008 (gmt 0)

This is something else, errorsamac; with the end-of-SERPs (EOS) re-rank problem you'd show up within the last few pages of those 1000 results for "example".

tedster
msg:3657740 - 8:19 pm on May 23, 2008 (gmt 0)

I'm convinced that the -950 is related to the phrase-based spam detection patent [webmasterworld.com]. Note that according to the patent, the penalized page may either go way down in ranking, or vanish altogether.

[0223] If the document is included in the SPAM_TABLE, then the document's relevance score is down weighted by predetermined factor... Alternatively, the document can simply be removed from the result set entirely.

Formal Patent on the USPTO website: http://appft1.uspto.gov/netacgi/nph-Parser?Sect1=PTO2&Sect2=HITOFF&u=%2Fnetahtml%2FPTO%2Fsearch-adv.html&r=1&p=1&f=G&l=50&d=PG01&S1=20060294155.PGNR.&OS=dn/20060294155&RS=DN/20060294155
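As a toy illustration of that paragraph, here is the re-ranking step it describes; the down-weight factor and the removal switch are illustrative knobs, not values from the patent:

```python
# Sketch of paragraph [0223]: a document in the spam table either has its
# relevance score down-weighted by a predetermined factor, or is removed
# from the result set entirely. Both knobs below are illustrative guesses.
DOWNWEIGHT_FACTOR = 0.05   # "predetermined factor" -- value is invented
REMOVE_ENTIRELY = False    # the patent's alternative treatment

def rerank(results, spam_table):
    """results: list of (doc_id, score) pairs; returns re-sorted results."""
    reranked = []
    for doc_id, score in results:
        if doc_id in spam_table:
            if REMOVE_ENTIRELY:
                continue                # vanish altogether
            score *= DOWNWEIGHT_FACTOR  # ...or sink way down, -950 style
        reranked.append((doc_id, score))
    return sorted(reranked, key=lambda r: r[1], reverse=True)
```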


OnlyToday
msg:3661201 - 7:17 pm on May 28, 2008 (gmt 0)

Here's the most bewildering thing that has ever happened to me with Google...

On December 22 my Google traffic was reduced by 80%, and I began trying everything possible, including many suggestions found on this forum, to right the situation. On May 16th I gave up and returned the site to its pre-December 22 state.

Two hours ago my Google traffic quintupled VERY ABRUPTLY and my traffic has now returned to pre-Dec. 22 levels.

Go figure.

edit: corrected spelling error

[edited by: OnlyToday at 7:18 pm (utc) on May 28, 2008]

potentialgeek
msg:3667309 - 4:24 am on Jun 5, 2008 (gmt 0)

***BREAKING NEWS***

I've broken free of the 950 Penalty today on my main site for the first time since last year! Hopefully the new SERPs will hold -- I realize other sites have broken free of the 950 beast for a short time and then slipped back.

The good news is I'm seeing a return to essentially the same SERPs the site had before the 950 Penalty across several different search phrases, including a prime target.

Late this afternoon I saw traffic surging and it's continued through the rest of the day. Google visitors just doubled.

I'm cautiously optimistic, but it's the first breakthrough of any kind for months, and it came after "deoptimization." I deoptimized in various ways as already noted in this thread (sometimes one round of changes per month).

The latest round was to remove duplicate anchor text and add on-page text to pages which had a very high link:text ratio. Too many links + not enough text = a problem.
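That ratio is easy to measure. A rough sketch, assuming beautifulsoup4; the filename and the 0.5 cutoff are arbitrary illustrations, not known Google values:

```python
# Rough sketch: what fraction of a page's visible text sits inside <a> tags?
# Assumes beautifulsoup4; "page.html" and the 0.5 cutoff are placeholders.
from bs4 import BeautifulSoup

def link_text_ratio(html):
    soup = BeautifulSoup(html, "html.parser")
    total = len(soup.get_text(" ", strip=True))
    linked = sum(len(a.get_text(" ", strip=True)) for a in soup.find_all("a"))
    return linked / total if total else 0.0

html = open("page.html", encoding="utf-8").read()
if link_text_ratio(html) > 0.5:
    print("Mostly links, not much text -- the pattern being de-optimized here.")
```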

In some instances I converted anchor text into text. In other instances I replaced anchor text with thumbnails completely free of alt tags, so the links are "ultraconservative," i.e., zero effort to get ranking points from anchor text.

I set up shortcuts in Dreamweaver so I could go through many pages quickly and strip the link off text in one step per link (Command-R). So it didn't take too much time actually to fix the entire site.

Fingers crossed,

p/g

annej
msg:3667382 - 7:47 am on Jun 5, 2008 (gmt 0)

Are people still seeing the -950 filter on just one or a few individual pages on a site or is it mostly site wide now?

SEOPTI
msg:3667560 - 1:19 pm on Jun 5, 2008 (gmt 0)

potentialgeek, all of my sites got hit at the time you were released. This tells me a major tweak happened at the Plex, since some of my sites have never been affected by the -950 nonsense before.

The link:text ratio sounds interesting.

[edited by: SEOPTI at 1:19 pm (utc) on June 5, 2008]

john28uk
msg:3667774 - 5:33 pm on Jun 5, 2008 (gmt 0)

Can't believe it: after being out of the 950 penalty for over 12 months, I am back in today! No changes were made on my side, so I am thinking this is a tweak. Here's hoping it's just a blip.

SEOPTI
msg:3667801 - 6:04 pm on Jun 5, 2008 (gmt 0)

I was also quite sure my sites would not get hit again, but after 12 months in the index, this huge hit... I don't know what to say... you can never be sure what they tweak, and you should always be prepared to lose all of your rankings within minutes.

CainIV
msg:3667806 - 6:07 pm on Jun 5, 2008 (gmt 0)

"I'm cautiously optimistic, but it's the first breakthrough of any kind for months, and it came after "deoptimization." I deoptimized in various ways as already noted in this thread (sometimes one round of changes per month)."

This is exactly what I did one year ago and broke free of the filter altogether, and never looked back.

Not to say that completely innocent websites cannot get caught up in this, but in general I have found that most websites can be diagnosed and freed from the grip of this penalty when the elements pertaining to any given filtered page are addressed.

For what it is worth, once my pages broke free and were ranking again, I slowly built VERY high quality links to those pages over the following few weeks.

potentialgeek
msg:3667973 - 9:33 pm on Jun 5, 2008 (gmt 0)

One of the other changes I made before getting back into Google's good graces (still back to normal today), which I forgot to mention, is that I deleted 100 pages. These were thin pages, mostly redundant and mostly links, which didn't get very many internal links back to them. They were being used as introductory, directory-like pages.

Google used to like very short pages, but the 950 penalty revised its position, at least in certain circumstances (e.g., thin pages AND many/mostly links).

So I'd suggest others consider their short pages and whether they should be deleted or their content merged into other pages. Do your short pages really need to be separate from other pages?
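For triaging candidates before deleting or merging, a rough sketch of such a "thin page" check, assuming beautifulsoup4; the word-count and ratio cutoffs are arbitrary illustrations:

```python
# Sketch of a "thin page" check: short pages that are mostly links are
# candidates for deletion or merging. Cutoffs are arbitrary illustrations.
from bs4 import BeautifulSoup

def is_thin(html, min_words=150, max_link_ratio=0.6):
    soup = BeautifulSoup(html, "html.parser")
    text = soup.get_text(" ", strip=True)
    linked = sum(len(a.get_text(" ", strip=True)) for a in soup.find_all("a"))
    link_ratio = linked / len(text) if text else 1.0
    return len(text.split()) < min_words and link_ratio > max_link_ratio
```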

p/g

annej
msg:3668009 - 10:48 pm on Jun 5, 2008 (gmt 0)

If the short pages are useful to visitors, couldn't they just be noindexed?

In terms of innocence: sometimes what we do naturally turns out to be optimization. I'd added links from each page in a category section to all of the others on the topic. I was trying to improve time on site, and SEO didn't even occur to me. But I think the -950 filter treated the repeated links as over-optimization.

potentialgeek
msg:3668191 - 5:56 am on Jun 6, 2008 (gmt 0)

Yeah, Google isn't so merciful with repeated links now. It can't tell whether you're trying to be helpful to visitors or spamming. I think my 950 problems may have started when I added repeated anchor text to related content. I wasn't paying attention, and sometimes the related-content links were identical. Even though they weren't right next to each other on the page, one can't really be surprised if Google thinks you're trying to spam.

Elsewhere I had a separate index for each topic, so it was like a directory. There was no need to have much text beside each link to the pages in the directory, so it was 99% anchor text.

So I just moved those links on pages with no content into the pages with content. I put them in the right-hand column. It accomplishes the same purpose and doesn't appear to upset Google. It may look neater and more organized to have a simple directory page with no content, but it's really not necessary. Actually, I think it's common practice to use the LHS for sitewide links and the RHS for directory-wide links.

I'm sure you could have directory link pages as long as they have content. In fact, for some directory indices, I'm keeping the intro page but adding content. They have PR3 on average and they're fairly old, so I don't really want to delete them entirely.

My site still hasn't recovered fully for the most competitive keywords/phrases. In most cases, in fact, the most competitive phrases still have a long way to go. I'm doing really well with long-tail phrases again, though. Which makes you wonder how much the 950 penalty is based on competitive phrase targeting/spam.

john28uk
msg:3669328 - 5:33 pm on Jun 7, 2008 (gmt 0)

Having read this and other 950 threads, I have made changes to my home page. All main sections of the site were linked twice, once from the header menu and once from a bold description in the content; I have removed the bold links.

Also, how long after making changes have people seen movement in their sites? I cannot remember how long it took last time.

potentialgeek
msg:3672792 - 8:33 am on Jun 12, 2008 (gmt 0)

john28uk,

It may take a cache update. Or perhaps less. It can depend on how much PR you have or how often Google respiders your page/site. Usually less than four weeks, it seems. It may now be a bit slower, though, because some webmasters report Google spiders less often lately.

Say you have a site with 1000 pages and many of them need to be fixed to get out from under the 950 penalty. Google could take its time and finish respidering them after 2 or 3 weeks. If there was that kind of a penalty on a site, better SERPs won't show until it's mostly/fully respidered.

Some sites, though, may only have a small number of offending pages, in which case a few days would restore SERPs. Some webmasters here got back to normal after only a very short time (days, not weeks).

p/g

directwheels
msg:3681993 - 12:42 am on Jun 24, 2008 (gmt 0)

I made a small site 2 years back, just a pet project about an electronic item. For the past 2 years I have done all white-hat stuff to it, and recently I made a minor change by adding a new section and boom! 950'd.

The section I added has a somewhat funny structure: I would have an item-type page (i.e. Widget), and that page would link to the various individual items under it (i.e. red square widget, red round widget, red triangle widget). The problem was, the red widget came in 5 different shapes, and because I wanted to concentrate on writing descriptions and getting pictures for a single page vs. 5 pages with one shape each but the same product, I listed all 5 items on the "Red widget" page. Each has a picture and a text link under it, but all 5 link to the same page. (There are many other items on the page, but often 2-3 different pictures and anchor texts link to the same page as well.)

I added the new section about 2 weeks ago, and got hit with the 950 today.

From reading the stuff in here, this is likely where my problem is. But I am not about to make a page for every variation of an item, because then when people link to me, they could be linking to one of 5 pages instead of just that one page, which would be better.

What I just did now is keep the multiple links to the same page, but after the first link I nofollowed the rest. So in the example above, the first link to the red widget page is a regular link, but links 2-5 to that page have nofollow.

I will report back to let you know if this works.
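That workaround can be applied mechanically. A minimal sketch, assuming beautifulsoup4, that keeps the first link to each URL and nofollows the repeats (whether this actually helps with the -950 is exactly what is being tested here):

```python
# Sketch of the workaround above: the first link to each URL stays a
# normal link; every repeat gets rel="nofollow". Assumes beautifulsoup4.
from bs4 import BeautifulSoup

def nofollow_repeats(html):
    soup = BeautifulSoup(html, "html.parser")
    seen = set()
    for a in soup.find_all("a", href=True):
        if a["href"] in seen:
            a["rel"] = "nofollow"   # repeat link: nofollow it
        else:
            seen.add(a["href"])     # first occurrence: leave as-is
    return str(soup)
```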

Whitey
msg:3682032 - 1:52 am on Jun 24, 2008 (gmt 0)

Tedster -
I'm convinced that the -950 is related to the phrase-based spam detection patent. Note that according to the patent, the penalized page may either go way down in ranking, or vanish altogether.

What are the top scenarios that you see this being applied to from a practical viewpoint?

Internal links?
IBL (inbound) links?
Overuse of keywords?

Do you think that Google applies this by degrees, i.e. -30, -60, the knockout -950, and so on?

tedster
msg:3682078 - 3:06 am on Jun 24, 2008 (gmt 0)

I see the greatest liability for a -950 style penalty coming from having too many navigational/menu links in the site template.

If you know a bit about SEO, too big a menu means you can easily cram in too many keyword variations - you've got a lot of anchor text to fill in on every page. The fix I see is to re-think your information architecture and create more concise navigation, rather than long laundry lists with anchor text that repeats and repeats and repeats important keywords. Concise menus are also much more user friendly.
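As a rough self-audit for this, one can count template menu links and how often an important keyword repeats across their anchor text. A sketch, assuming beautifulsoup4; the nav/.menu selectors and the 40-link cutoff are assumptions about the template, not known thresholds:

```python
# Sketch of a mega-menu audit: menu size plus keyword repetition in the
# menu's anchor text. Selectors and the 40-link cutoff are assumptions.
from collections import Counter

from bs4 import BeautifulSoup

def menu_report(html, keyword, max_links=40):
    soup = BeautifulSoup(html, "html.parser")
    links = soup.select("nav a, .menu a") or soup.find_all("a")
    words = Counter(w for a in links
                    for w in a.get_text(" ", strip=True).lower().split())
    print(f"{len(links)} menu links; '{keyword}' appears "
          f"{words[keyword.lower()]} times in their anchor text")
    if len(links) > max_links:
        print("Laundry-list menu -- consider more concise navigation.")
```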

In earlier years, working with search engines was like working with someone who was hard of hearing. You had to raise your voice and repeat yourself over and over to make your point - "THIS IS WHAT MY WEBSITE IS ABOUT." Those same habits today will get you smacked, where they used to get rewarded!

I'd say the -30, -60, or whatever number you see, are a different penalty mechanism. It's much more related to trust, backlinks and guideline violations, rather than "over-optimization" on the page.

tedster
msg:3682090 - 3:32 am on Jun 24, 2008 (gmt 0)

By the way, there are certainly other ways to trigger this over-optimization penalty, but the mega-menu is the most common one I've seen. The other kinds of over-optimization are usually something you kind of know you did, but the mega-menu problem can be almost invisible to the webmaster. That's even more likely if a lot of the anchor text is hidden in a CSS menu system, so you never notice it all at one time.

annej
msg:3682104 - 4:17 am on Jun 24, 2008 (gmt 0)

What I used to have is something like 10 to 15 pages in a subsection all linking to each other. (I was trying to increase the time people spent on site by linking to all related pages - silly me.) But only a few pages were -950ed, so I think it's a combination of the number of links with their anchor text and certain keywords that the phrase-based thing has identified as being typical of spam. Certain words can be completely innocent in many contexts, but they are related to spam, and that seemed to hurt.

I've been -950 free for some time, and it did seem to be related to anchor text -- both how much of it there was and specific flagged words.

It does help to read those phrase-based patents, no matter how torturous. It helped me pinpoint some problems.

potentialgeek
msg:3682126 - 5:52 am on Jun 24, 2008 (gmt 0)

I saw an instant rebound on long tail search phrases around June 4/5, and I'm seeing a gradual rebound on the more competitive phrases.

This could be in part at least the result of developing content slowly and naturally on the thinnest pages. Some of my best ranking pages ever were developed and interlinked slowly. Most of my sites that don't do so well were made in rushed fashion. I don't feel one can cut corners in the SE game so much nowadays.

p/g

CainIV
msg:3682129 - 5:55 am on Jun 24, 2008 (gmt 0)

I would agree, Tedster; almost every example I have seen of a -950 website has had those same characteristics -- overdone, bloated, keyword-rich menus. The telltale sign is when the menu uses full keywords to describe mere properties or variations of a given product (flat, round, wide, red, white, blue), etc.

About 7 months ago, I did some testing on a website of mine (which was ranking fine -- top 10 in Google for its primary two-word key phrase). The website had medium competition but was not set up to monetize.

What I did was link excessively between pages from within the content only. In this case, each page was 500 words of unique content. The menu was in fact very small, and not keyword-rich by any means.

I used full and exact keywords to link between the pages of the website (a total of about 60 pages). I linked from many of the pages to the root with keyword anchors, and linked about 15 times between each article.

I never tripped any filters at all, and after losing some positions (3-5) for a week or two, all keywords moved up in rankings significantly, even with excessive content links (in all search engines).

Although this does not tell us the exact nature of the penalty, it certainly may tell us a lot about what the penalty is NOT about.

annej
msg:3682333 - 1:05 pm on Jun 24, 2008 (gmt 0)

linked about 15 times between each article

I don't think it's just the excessive interlinking. That has to be combined with Google picking up something specific, and I think that's where the phrase-based stuff comes in.

CainIV
msg:3682536 - 4:42 pm on Jun 24, 2008 (gmt 0)

That was the point of the test: when you interlink using closely related phrases repeated in the menu, this phenomenon is most likely to occur, but it does not seem to occur elsewhere, since in-content keyword links are never likely to repeat that many times.

In fact, my next test will be all about using the same website to "overbloat" the navigation with excessive closely related phrases (red widgets, blue widgets, fuzzy widgets) to test the theory.

soxos
msg:3682742 - 8:55 pm on Jun 24, 2008 (gmt 0)

I also have a new site which seems -950'd: some pages are on the 1st page (others are -950). The pages that are -950 have exactly the same menu as the pages which are on the 1st page, and the phrases are equally competitive. I have tried to de-optimise the offending pages as best I can, but they were very clean, unique content with a minimum of 650 words.

I am almost at the point of removing the keyword from the links, although to me this would make it harder for users to understand the nav (which is 18 links). There are 40 pages on the site in total (all content but 2 forms).

annej
msg:3683248 - 4:40 pm on Jun 25, 2008 (gmt 0)

I am almost at the point of removing the keyword from the links

I had to do that for keywords that seemed to be triggering the filter. I put in synonyms, but they were awkward. Several months later I was able to restore the original keywords without losing the pages again. So the algo must have changed, or maybe my overall site changed in a way that helped.

In another case just one good link to the page cured the problem. You just have to try a scattering of things.

Whitey
msg:3683916 - 10:06 am on Jun 26, 2008 (gmt 0)

Tedster -

I'm convinced that the -950 is related to the phrase-based spam detection patent. Note that according to the patent, the penalized page may either go way down in ranking, or vanish altogether.

Could this also be applied to the meta titles and descriptions?

Could it be part of the duplicate content filtering which causes folks' sites to disappear?

I'm just thinking that maybe Google could be applying this by degrees according to the percentage of duplicate content they see. Hence the different mechanisms you refer to [aka -40, -60, etc.]. I'm sure there are other factors, but I thought this could be one.

What do you think?

Quoting arubicus (9:13 pm on Feb 27, 2008, msg #3586456): "Enough has been said re the meta description tag, where everyone knows to make it unique. I often wrote the tag to be about 240 characters long instead of 152, but wrote the first 152 unique for the snippet in the search engines. I then wrote more unique text after that, hoping Google would see it (but would obviously not show it)."

I'm seeing this type of thing repeated in several posts here where successful release from the filter has been achieved.
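A quick way to check a site against the quoted technique -- each description's first 152 characters being unique -- as a sketch, assuming beautifulsoup4; the file list is a placeholder:

```python
# Sketch: flag pages whose meta description snippets (first 152 chars,
# per the quoted post) duplicate another page's. File list is a placeholder.
from bs4 import BeautifulSoup

SNIPPET_LEN = 152

def meta_description(html):
    tag = BeautifulSoup(html, "html.parser").find(
        "meta", attrs={"name": "description"})
    return tag.get("content", "") if tag else ""

seen = {}
for path in ["index.html", "widgets.html"]:
    snippet = meta_description(open(path, encoding="utf-8").read())[:SNIPPET_LEN]
    if snippet and snippet in seen:
        print(f"{path} duplicates the snippet of {seen[snippet]}")
    seen.setdefault(snippet, path)
```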

Maybe g1smd, who uncovered the duplicate content filter a while back, will have an opinion.

[edited by: Whitey at 10:37 am (utc) on June 26, 2008]

soxos
msg:3683929 - 10:24 am on Jun 26, 2008 (gmt 0)

I have started removing keywords from nav links, one main page at a time. IMO it has made the nav awkward, as mentioned above, but as this is a new site, hopefully as trust builds I will be able to revert to my "common sense" link text. Curiously, I changed the first link 2 days ago, and whilst that target page is still -950ed, many pages on the site have shifted up.

My logic for this: removing just one instance of a keyword-rich link repeated in the nav of all pages has de-opped every page enough to reduce the effect of the phrase-based link spam detection filter.

I don't expect the target page to return for at least a week, as other de-ops took place too, including a title change.

This site is new and clean, so it should be a good test.

Whitey
msg:3683950 - 10:51 am on Jun 26, 2008 (gmt 0)

My logic for this: removing just one instance of a keyword-rich link repeated in the nav of all pages has de-opped every page enough to reduce the effect of the phrase-based link spam detection filter.

Are your meta titles and descriptions completely different or do you have a lot of similar characters in the title and description snippet?

Were the navigation links that you have removed pointing to the same terms in the meta title?

And was this then repeated throughout the site as a larger proportion of the characters in the meta title?

I'm still thinking duplicate content.

soxos
msg:3683984 - 12:12 pm on Jun 26, 2008 (gmt 0)

# Meta tags / titles are all completely different, all hand written -- perhaps a few words overlap between pages, as the site in general is quite in-depth.

# The nav links do match some of the words of the meta title of the destination page -
e.g. a "large red widgets" link goes to a page with:
page title: information about large red widgets from widgets r us

# There are no repeated blocks of text within the site (except the nav), although the wording is all based on a specific theme, so you would find a certain number of words -- for example "large" and "widgets", as this is the theme of the website -- in most titles, e.g. "how to make extra large dark red widgets". (Although again, all the content is hand written and specific to this element of the site; the smallest page is 600 words of unique text.)

idolw
msg:3684005 - 12:52 pm on Jun 26, 2008 (gmt 0)

When I look at my sector, all the big guns excessively use internal linking with keywords. They have up to 200 links per page, each of them full of keywords.
They all rank fine and seem to do better than our sites with limited internal linking.

I think the -950 penalty is about content, not links. Are you all 100% sure the content is all right?
