
Google SEO News and Discussion Forum

Google's 950 Penalty - Part 5
steveb
msg:3250638
11:44 pm on Feb 12, 2007 (gmt 0)

< continued from [webmasterworld.com...] >
< related threads: -950 Quick Summary [webmasterworld.com] -- -950 Part One [webmasterworld.com] >

"That's exactly the sort of sites I'm referring to"

Unfortunately, some commenting on this issue apparently can't be bothered to actually, horrors, look at the serps. Authority has a specific meaning with Google, and it's plain that authority sites are what are commonly, mistakenly, hit by this penalty. I don't think this is a good summary of the effect, but one simplistic way to look at it would be to say that authority sites with a volume of quality in-links are sometimes being confused with spam sites with a volume of rotten-quality in-links.

One of the most interesting phenomena is how an authority site can be #1 for red topic, blue topic, green topic and purple topic, but be 950 for orange topic, even though the linking, page structure and keyword usage are basically the same for all of them. Clearly a ranking mistake is being made (either the 950 result, or all those #1s).

[edited by: tedster at 9:17 pm (utc) on Feb. 27, 2008]

 

annej
msg:3270781
6:00 pm on Mar 4, 2007 (gmt 0)

MHes

If your home page is hit, others can rank normally.

Are you saying that even though the homepage has been affected, its PR is still there, so that deep pages are not affected?

Predictive phrases as per the patent are a factor.

Yes, but how extensive is the predictive factor? It would predict the common words and phrases. But would it predict phrases that are rarely seen online as in my war and hobby example in my last message?

All sites are treated the same.

I think you may be right there. My site is 11 years old and I suspect it is viewed as an authority, given the gov and edu links plus many more related links. But I think only the homepage is protected. It is the one with all the links; the individual pages have very few. It's the links that give protection, not any authority or age designation.

anax

So, is a basic recommendation for average webmasters like me to reduce the number of times a phrase might be repeated on a page? (Even if I think it's legitimate?) And would this apply to the meta element and page title also?

First, I wouldn't change anything unless the page is affected. It's not the number of times any phrase is used. It only becomes a problem if there are phrases that are designated as possible spam phrases.

If the page has dropped way down in the serps it won't hurt to reduce repetition as long as the page and internal links still make sense.

MHes
msg:3270787
6:06 pm on Mar 4, 2007 (gmt 0)

>reduce the number of times a phrase might be repeated on a page?

No. A 'phrase' is only counted once.

wanderingmind
msg:3270794
6:13 pm on Mar 4, 2007 (gmt 0)

How is one supposed to tweak the internal anchor text of related stories for a page on a news website?! This is probably possible for a corporate site, but if I have to go around checking every internal link from one story to another to avoid accidental keyword repetition, my earnings will soon end up with the shrink!

Google or not, we have to link by our headlines, as that is natural for a news website such as mine. And I suspect this is costing me already - I see new pages linked from the homepage going supplemental, while old stories linked from sections remain steady in the rankings.

Is this in any way time-limited? Like, older pages are unaffected?

God help me.

MHes
msg:3270802
6:17 pm on Mar 4, 2007 (gmt 0)

>Are you saying that even though the homepage has been affected it's PR is still there so that deep pages are not affected?
Yes. If the homepage is dumped, its 'local rank' value is lost, but PR is a separate issue.

>Yes, but how extensive is the predictive factor? It would predict the common words and phrases. But would it predict phrases that are rarely seen online as in my war and hobby example in my last message?
Adwords is a good source of potential phrases and predictive phrases.

> It's the links that give protection not any authority or age designation.
A page can be redeemed if links help it. Think of the penalty as a page being put to one side. If Google then sees redeeming factors, a condemned page can be redeemed. If not, it is dumped. Hence pages can rank or be dumped, according to the search.
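
To make that concrete, here is a minimal sketch of the "set aside, then redeem" flow as I picture it - the redeeming-link score and its threshold are invented, purely to illustrate how the same page can rank for one search and be dumped for another:

```python
# Minimal sketch of the hypothesised "set aside, then redeem" flow.
# The redeeming-link score and the 0.5 threshold are invented for
# illustration; nothing here is Google's actual algorithm.
def rank_page(trips_filter: bool, redeeming_link_score: float,
              normal_rank: int) -> int:
    if trips_filter:                      # page is put to one side
        if redeeming_link_score > 0.5:    # strong relevant in-links redeem it
            return normal_rank            # redeemed: ranks as usual
        return 950                        # dumped to the end of the results
    return normal_rank

print(rank_page(True, 0.9, 1))  # 1   - condemned but redeemed
print(rank_page(True, 0.1, 1))  # 950 - condemned and dumped
```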

anax
msg:3270865
7:09 pm on Mar 4, 2007 (gmt 0)

> >reduce the number of times a phrase might be repeated on a page?

> No. A 'phrase' is only counted once.

I'm sorry, this stuff is making my head spin. Independent of the analysis, could someone post a two or three item list of recommendations for people hit by this penalty? Just the bare proposals ("reduce keyword density", "eliminate meta descriptions"), and then with that we can go back and try to understand the theory behind it.

MHes
msg:3270874
7:43 pm on Mar 4, 2007 (gmt 0)

>could someone post a two or three item list of recommendations for people hit by this penalty?

1) Organise your navigation to focus on a theme.
2) Organise your content to focus on a theme.
3) Organise so a dimwit can navigate your site.... keep it simple and relevant.

anax
msg:3270889
8:00 pm on Mar 4, 2007 (gmt 0)

Yes, but the problem is those are old recommendations that I thought I had been following all along in the interest of my users. What does the average webmaster whose traffic was destroyed need to do *differently* now? (I'm trying to understand.)

CainIV
msg:3270936
9:37 pm on Mar 4, 2007 (gmt 0)

This is a very interesting topic. I have one website that, it would seem, just got hit by this.

I have a few questions for those of you - including tedster, MHes, randle and annej - who have spent some time looking at this penalty.

I believe, as you do, that much of this has to do with internal linking as well as phrase-based detection.

I have one website that had been top ten in Google and tanked last year. I had a sitewide template with navigational links to the main categories.

Every page in the website linked from the left side nav to these main categories.

In terms of a sitewide hierarchy, I am looking for suggestions on what has worked best.

I do not use extensive keyword linking in my sitewide navigation, only the bare bones.

Does it make sense to:

Index - lists all main second-level categories
Second levels - also list all second-level categories, like the index

Individual article pages - breadcrumbs only, pointing to the main category the article belongs in and to home.

This makes sense from a visitor standpoint, so I don't see how it could be an issue. Any thoughts or feedback?
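
Roughly, the structure I have in mind looks like this - all page names are hypothetical placeholders, not from any real site:

```python
# Sketch of the proposed hierarchy: index and second-level pages share the
# category list; article pages carry breadcrumbs only (home + own category).
site = {
    "index":      ["category-a", "category-b", "category-c"],
    "category-a": ["category-a", "category-b", "category-c",
                   "article-1", "article-2"],   # second level, like the index
    "article-1":  ["index", "category-a"],      # breadcrumbs only
}
for page, links in site.items():
    print(page, "->", links)
```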

steveb
msg:3270948
9:58 pm on Mar 4, 2007 (gmt 0)

"Independent of the analysis, could someone post a two or three item list of recommendations for people hit by this penalty?"

Build your sites with unique content and good site construction, in accordance with user-friendliness and Google's webmaster guidelines.
Expect that Google sometimes makes bad judgments in ranking.
Avoid adding new pages below the level of a penalized page; put them somewhere else.

madmatt69
msg:3270987
10:55 pm on Mar 4, 2007 (gmt 0)

Like clockwork, I'm back again today. It definitely seems to be a two week on, two week off cycle. My homepage was cached today as well.

Undead Hunter
msg:3271075
12:56 am on Mar 5, 2007 (gmt 0)

IF sites like Matt's are coming back after an approximately two-week period, then maybe all the ideas in this thread are what will ultimately PROTECT a site like Matt's in the future? Maybe Google is intentionally knocking down these sites in order to see what rises to the surface?

This is one reason I'd be concerned about making changes at this point.

It seems to most of us that our layouts are as logical as you'd expect. I mean, as far back as '98 Jakob Nielsen was telling people to name things plainly, that *searchers* need to see key phrases immediately to connect with a page. I'm not sure that changing that is such a good idea.

Any thoughts?

Biggus_D
msg:3271087
1:11 am on Mar 5, 2007 (gmt 0)

Not always 2 weeks - I've been almost a month in hell, and just 3 days in January.

annej
msg:3271101
1:44 am on Mar 5, 2007 (gmt 0)

MHes
>reduce the number of times a phrase might be repeated on a page?
No. A 'phrase' is only counted once.

We may not be talking about the same thing. I'm basing my multiple-phrase theory on the phrase-based spam document detection patent. It states that they are looking for pages where "the actual number of related phrases present in the document significantly exceeds the expected number of related phrases".

I assumed this meant multiple incidences of certain phrases. Also, I understand a phrase can be a single word. Could you explain more how a phrase is counted just once? Does the above mean too many occurrences of different related phrases?
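
Here is how I am reading that patent language, as a toy sketch - the phrase list, expected count and multiplier are all invented, just to show the "distinct phrases, counted once" reading:

```python
# Toy reading of the patent: flag a document when the number of *distinct*
# related phrases far exceeds the expected number. Each phrase counts once,
# no matter how often it repeats. All numbers here are invented.
RELATED_PHRASES = {"blue widgets", "cheap widgets", "widget repair", "buy widgets"}
EXPECTED_RELATED = 1.5   # hypothetical expected count for an honest page
SIGNIFICANCE = 2.0       # "significantly exceeds" - made-up multiplier

def looks_spammy(text: str) -> bool:
    text = text.lower()
    present = {p for p in RELATED_PHRASES if p in text}  # counted once each
    return len(present) > EXPECTED_RELATED * SIGNIFICANCE

doc = "Cheap widgets! Buy widgets here. Widget repair. Blue widgets galore."
print(looks_spammy(doc))  # True: 4 distinct related phrases > 3.0
```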

3) Organise so a dimwit can navigate your site.... keep it simple and relevant.

I ran into a couple of problems doing this. I had left-hand navigation interlinking the articles in a given section. The links had just one or two words in them, the most descriptive for each article. Apparently some were problem phrases.

Also, on contents pages I had linked each article using the article's title; it appeared to help to change those links so they aren't the same as the title.

wanderingmind
Is this in any way time-limited? Like, older pages are unaffected?

I'm quite certain there is no protection in age. I lost one page that had been online since '98.

>could someone post a two or three item list of recommendations for people hit by this penalty?

What people have suggested here makes sense if you want it in a few suggestions. If you want to do more, it's not so simple. You really have to read through all the 950 threads and other related threads, then decide what you want to try. It's a trial-and-error thing.

madmatt69
msg:3271104
1:49 am on Mar 5, 2007 (gmt 0)

Biggus_D - That's what makes this so hard to figure out. Mine definitely seems to be on a two-week cycle since January, with a few odd bumps. But I'm sure everyone else's sites are different.

It's annoying too, because I can see on my site when someone uses my little "Refer this site to a friend" link - which gets a lot of usage when my Google traffic is up. Which to me means I'm doing something right... but of course that doesn't get factored into Google's algo.

I've done about all I can on my site - definitely found some old pages that shouldn't have been there, fixed a few nav issues, and made my code much more standards compliant. Some new links too.

But I don't think any of that is influencing this two-week cycle I seem to be on. It seems to me Google is still testing something to do with phrases, and any changes we make to our sites now probably won't make much difference until they settle on this algo.

Biggus_D
msg:3271122
2:17 am on Mar 5, 2007 (gmt 0)

I've been reading more posts, and someone (I don't remember the name) said their site was hit for 3 months last summer and recovered at the end of September... until January.

So I have a question. Historically speaking, do these weird Google issues get solved eventually, or do people just get tired of complaining and get another job?

Because you can't run a business with those huge random drops.

Undead Hunter
msg:3271127
2:31 am on Mar 5, 2007 (gmt 0)

Biggus:

Personally, Jan. was the first time since Bourbon a year and a half ago (?) that we've been hit. We've had some high and low periods that seemed strange, but nothing drastic either way.

We're down to between 1/4 and 1/3 of revenue. But that's enough to sustain us (barely). I plan to start new sites in this period and hope for the best. I assume it will work itself out, but if it doesn't, I figure we'll eventually learn what we did "wrong" and work around that in the future.

Giving up and going to a "job" is never an option, not on the table at all. Even if we were bust next week it wouldn't be an option.

madmatt69
msg:3271141
2:33 am on Mar 5, 2007 (gmt 0)

Yeah. Well, I came to the realization early on to plan for at least a few months of the year being lost to random algo changes. But nothing was ever as severe or random as what's going on now.

It definitely has me wondering if I want to stay in the business! But I'll stick it out :)

Biggus_D
msg:3271158
3:31 am on Mar 5, 2007 (gmt 0)

The problem is that some businesses cannot hire and fire people that often. And sometimes you can't just fire yourself.

Anyway, I was checking Analytics, and since the drop it's like someone drew a line saying you'll never pass this mark.

It's really curious, because I have never seen such perfect results - so consistent, so clean that it seems made on purpose.

macman23
msg:3271194
4:48 am on Mar 5, 2007 (gmt 0)

1) Organise your navigation to focus on a theme.
2) Organise your content to focus on a theme.
3) Organise so a dimwit can navigate your site.... keep it simple and relevant.

1) That's how I originally designed my site.
2) The theme of my site couldn't be more obvious.
3) I have designed my site to be as intuitive and as easy to navigate as possible.

My website still ranks at the bottom of the last page of search results. It used to rank at number 5. This penalty has been hurting my traffic substantially for several months. Any other ideas?

anax
msg:3271204
5:18 am on Mar 5, 2007 (gmt 0)

I'm really in the same condition as macman in the previous post, and I was hit in the first round of this chaos, the October Massacre [webmasterworld.com].

There really isn't any clear information after 4 months. For lack of any other possibilities, today I deleted my meta descriptions, removed a bunch of internal keyword links, deleted a few occurrences of various keywords, changed the titles of various links to read (e.g.) "Page up" instead of "Blue widgets," etc.

Does this sound stupid? Yes. Has Google given me any options? No. They killed 80% of my traffic; it would be hard to do anything that would make it worse. Shooting in the dark is all that's left.

steveb
msg:3271217
5:43 am on Mar 5, 2007 (gmt 0)

That's about the fifth round, not the first.

trakkerguy
msg:3271219
5:47 am on Mar 5, 2007 (gmt 0)

Anax/MacMan23 -

Any way you can stay away from the "problem" words in the phrases on your page, and in particular in the internal links, should help. Instead of "page up" - if you know "widgets" is the problem, can you just make the anchor text "blue"? That has worked for me.

Anax - The site I am working with was hit mid-October also, then worse on Dec 7 and Dec 20. Any correlation on those dates for you?

MHes
msg:3271250
7:04 am on Mar 5, 2007 (gmt 0)

macman23, Anax and annej

If you had 10,000 links pointing to a page with the anchor "cheap widgets", and the page had the phrase on it 10,000 times and no related phrases... you would not be hit by 950.

As an aside, you may not rank well for other reasons, but you would not be hit by this filter.

CainIV
msg:3271254
7:17 am on Mar 5, 2007 (gmt 0)

As an aside, you may not rank well for other reasons, but you would not be hit by this filter.

Please elaborate as to why.

pxc433
msg:3271486
2:16 pm on Mar 5, 2007 (gmt 0)

I've been hit with this as well, but not on all datacentres. Does this mean we *will* be hit everywhere soon? I first noticed it 2 days ago.

If you search for 'keyword1 keyword2' (one of our key phrases) we are nowhere on some data centres.

If you search for 'keyword2 keyword1' then we rank where we used to rank for 'keyword1 keyword2'.

I've read and re-read this thread a few times. What I'd ask is for those who've 'escaped' to post and summarise what they did, saying how long they've been out of the penalty.

nuevojefe
msg:3271533
3:11 pm on Mar 5, 2007 (gmt 0)

Please elaborate as to why.

Because this is triggered when you have an excessive number of related phrases.

If "flipping widgets" has 100 phrases that Google considers related, there is a certain number of those phrases which can be present on the page before this filter can be implemented. It seems to depend on a number of factors though; it's not just a number like 41 related phrases is ok, 42 and above is not for example.

To understand the filter you have to understand what they are trying to combat. Previously, a number of prevalent spammers were abusing Google's LSI/LSA-based ranking criteria quite effectively, especially in conjunction with trusted-domain parasitic SEO.

Even so, it seems this filter is really going awry in many cases. I find it surprising that it wouldn't be effective enough turned down a few notches.
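
To illustrate the "no hard cutoff" point, here is a toy score - every weight and factor name is invented; only the shape of the idea matters:

```python
# Toy illustration: a soft score combining several signals, rather than a
# hard "42 related phrases and you're out" cutoff. All weights are invented.
def filter_score(related_count: int, expected: float,
                 anchor_diversity: float, trust: float) -> float:
    excess = max(0.0, related_count - expected) / max(expected, 1.0)
    return excess - 0.5 * anchor_diversity - 0.8 * trust

FILTER_LINE = 1.0
# Same phrase count, different outcomes depending on the other signals:
print(filter_score(42, 20, 0.9, 0.8) > FILTER_LINE)  # False: trusted page survives
print(filter_score(42, 20, 0.1, 0.0) > FILTER_LINE)  # True: thin page is set aside
```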

thedigitalauthor
msg:3271539
3:13 pm on Mar 5, 2007 (gmt 0)

Some of my pages are in supplemental for key words/terms. One of the pages I have tracked, because it has been in the top 3 of the Google SERPs almost since inception, has gone back and forth in the SERPs with the 950 penalty. Today I went looking for the page; not finding it in the top 3, I searched elsewhere. It could not be found in the top 1000 results. With trepidation, I clicked on the "repeat the search with the omitted results included" link, and of course my page was in the #2 spot. This was an original content page that I had inside knowledge on, so because of its position and content it had a good number of inbound links from other related sites [although today, only internal links are showing up on "info:http://www.sitename.com"].

This is not true of the few other pages/terms I have checked. Is this related to the penalties we are discussing, or just another separate issue I need to deal with? [As I have read, once a page/term is in the supplementals, you are in there for the long haul.]

MHes
msg:3271552
3:31 pm on Mar 5, 2007 (gmt 0)

>Is this related to the penalties we are discussing, or just another separate issue I need to deal with?

I would guess it is a different issue. One symptom was to be at the end of the normal results or at the end of the filtered results. However, Google changes things all the time, so it is never a clear picture. The "repeat the search with the omitted results included" link adds &filter=0 onto the search string, so you are being filtered out when the normal search is done. I would go through all the factors concerning duplicate content, and get good relevant links in first.
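
If you want to compare the two result sets quickly, the unfiltered search is just the same query with &filter=0 appended - a quick sketch, with a placeholder query:

```python
# The "repeat the search with the omitted results included" link is the same
# query with &filter=0 appended. The query string here is just a placeholder.
from urllib.parse import urlencode

query = "blue widgets"
filtered   = "http://www.google.com/search?" + urlencode({"q": query})
unfiltered = filtered + "&filter=0"   # omitted results included

print(filtered)
print(unfiltered)
```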

annej
msg:3271643
4:56 pm on Mar 5, 2007 (gmt 0)

Why would you delete meta descriptions? If they are individually written and not keyword-stuffed, there should be no problem. I agree that if they are all duplicates they should go. But a well-written meta description usually shows up on the search results page and can help searchers choose your page over others.

qwerty
msg:3271657
5:09 pm on Mar 5, 2007 (gmt 0)

I'm a hobbyist webmaster who has been hit by the 950 penalty.

Can anyone give some guidance on how to identify a "problematic keyphrase"?

I've looked at other sites at the bottom, trying to find a pattern of similar keyphrases, to no avail. My site is not in a competitive field, but it is a 7-year-old site.

Until I can figure out my problem phrases, I figure the only thing I can do is get more backlinks.

trakkerguy
msg:3271728
6:19 pm on Mar 5, 2007 (gmt 0)

qwerty - As nuevojefe says, "To understand the filter you have to understand what they are trying to combat." Problem phrases are likely to be those that spammers tend to abuse. If a phrase is not searched for often, there is little incentive for spammers to flood the serps, and little reason for Google to apply these filters.

Don't know how it holds for others, but from what I've seen, problem phrases are searched for often enough that they show up on Google Trends.
