Google's 950 Penalty - Part 13
potentialgeek
msg:3570326 - 11:01 am on Feb 9, 2008 (gmt 0)

< continued from [webmasterworld.com...] >
< related threads: -950 Quick Summary [webmasterworld.com] -- -950 Part One [webmasterworld.com] >

I did all those types of navigation link changes, but they didn't improve anything for my site. I've waited a few weeks for Google to crawl it completely, and the cache today shows about 99% of pages cached, with no keyword repetition in successive navigation links.

I'd like to know if a few "bad" apples (pages) can keep an entire site 950d.

I've removed all footers, with no progress either. I'm thinking the only things left to remove are the headers.

The old navigation structure of my site has headers which are one-word keywords. There are about nine in all.

e.g.:

Keyword1 Keyword2 Keyword3 . . . Keyword9

But for each of the directories, i.e.:

http://www.example.com/keyword1/

there is still repetition of the horizontal header nav link in the vertical menu:

e.g.:

Keyword1 Keyword2 Keyword3 . . . Keyword9

Keyword1 Widgets
Red
White
Blue
...

I had thought, or at least hoped, that having the same link with the same anchor text on the same page wouldn't get penalized. But the 950 is so a-retentive, it could say, "It's SPAM!"

Obviously it's going to look silly if I remove the header link that is used in the vertical menu, so do I have to remove the vertical menu link instead?!

That's just bad site structuring.
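For anyone wanting to audit this kind of duplication before tearing menus apart, a rough script helps. A sketch (Python, with requests and BeautifulSoup assumed; the URL is just a placeholder):

# Rough sketch: flag (anchor text, href) pairs that repeat on one page.
# Assumes requests + BeautifulSoup are installed; the URL is hypothetical.
from collections import Counter

import requests
from bs4 import BeautifulSoup

def duplicate_anchors(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    pairs = Counter((a.get_text(strip=True).lower(), a["href"])
                    for a in soup.find_all("a", href=True))
    return {pair: n for pair, n in pairs.items() if n > 1}

for pair, n in duplicate_anchors("http://www.example.com/keyword1/").items():
    print(f"{n} x {pair}")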

I HATE THIS 950 POS!

I know that many people have got the 950 lifted by doing what you said, removing the spammy links, but in the early discussions about the 950 there was talk about phrases.

"Google's 950 Penalty (part 4) - or is it Phrase Based Re-ranking?"
[webmasterworld.com...]

"I'd say the new Spam detection patent is a good candidate. It certainly does re-sort the results and it also has the potential to generate drastic changes for just some search phrases and not others."--Tedster

"You can get your page included in the spam table by having too many occurances of semantically related phrases. This certainly suggests some modifications for both body copy and anchor text that might be worth an experiment. The threshold for "too many" can also be fine-tuned in this algorithm, which would create the "popping in and out of the penalty" quality that some people report."--Tedster

So it's possible multiple things could assassinate your site. And #*$! are you supposed to do if the way you write naturally triggers the 950? Re-write every frickin' page? Get somebody else to?! Look at every competitive phrase on every page and remove/change it? My site has over 1,000 pages. I could fiddle around with the text on all 1,000 pages and still be 950d. At least with anchor text you can search and replace an entire site fairly quickly.

Re: too many occurrences of semantically related phrases. This certainly suggests some modifications for both body copy...

Just one other comment. Google, to apply its phrase-based spam detection filter, needs a list of semantically related phrases. It has one, obviously, or the filter would be useless. But that list of phrases is a secret, right?

Well, not exactly. It's only partially hidden. Google reveals it at the bottom of SERPs, those sets of 'suggested' related searches. I confess when they first came out, I targeted them.

That could have resulted in Google interpreting my site to have, as Tedster put it, "too many occurrences of semantically related phrases."

I didn't go crazy by targeting 1,000 different popular search phrases, but if the threshold was quite low (under 25), it could have poisoned my site.
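Purely as a back-of-the-envelope check (the real phrase list and threshold are Google's secret; the 25 here is only my guess), you could count hits like this:

# Toy check: how many "related search" phrases appear in a page's text?
# The phrase list and the threshold are assumptions, nothing confirmed.
def related_phrase_hits(page_text, related_phrases, threshold=25):
    text = page_text.lower()
    hits = [p for p in related_phrases if p.lower() in text]
    return hits, len(hits) >= threshold

phrases = ["red widgets", "cheap red widgets", "best red widgets"]  # hypothetical
hits, over = related_phrase_hits(open("page.html").read(), phrases)
print(len(hits), "related phrases found; over threshold:", over)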

Has anyone here not got the 950 lifted by anchor text changes, but only by phrase changes?

p/g

"The good thing about being at the bottom of SERPs is you don't worry about being overtaken."--p/g

[edited by: tedster at 9:06 pm (utc) on Feb. 27, 2008]

 

Robert Charlton
msg:3645492 - 8:05 am on May 9, 2008 (gmt 0)

Anyone else notice a sudden release of 950ed pages back to the top? I have, and I'm wondering whether it's a singular effect on one site, or whether others are seeing some sort of relaxation of the filters.

It may also, of course, be temporary... but I haven't seen a jump quite like this one in many months... and I've only seen one like it before, ever.

steveb
msg:3645522 - 9:06 am on May 9, 2008 (gmt 0)

No 950 changes seen here.

andylc0714
msg:3645572 - 10:17 am on May 9, 2008 (gmt 0)

It consistently happens: one page comes out, another falls in. So frustrating...

OnlyToday
msg:3647942 - 3:31 pm on May 12, 2008 (gmt 0)

On Apr 26 I reported here that I had reduced all my pages to fewer than 100 links, including internals, by removing a substantial portion of my nav menu. 3-4 days after doing that, my traffic dropped, consistent with having lost the PR transferred through the internal links.
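(For anyone repeating the experiment, a quick way to verify the per-page totals; a Python sketch with requests/BeautifulSoup assumed and placeholder URLs:)

# Quick audit: total <a href> links per page, flagging pages over 100.
# Assumes requests + BeautifulSoup; the URL list is a placeholder.
import requests
from bs4 import BeautifulSoup

for url in ["http://www.example.com/", "http://www.example.com/keyword1/"]:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    n = len(soup.find_all("a", href=True))
    print("OVER 100" if n > 100 else "ok", n, url)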

As of today there is no lifting of the penalty, so I will be replacing the internal link menus. Another failed experiment...

webastronaut
msg:3648105 - 6:28 pm on May 12, 2008 (gmt 0)

Lots of competitors, I see, came out to the first and second pages of results and seem to have knocked me down to result 11, for now anyway.

Lts95
msg:3648160 - 7:40 pm on May 12, 2008 (gmt 0)

Anyone else notice a sudden release of 950ed pages back to the top? I have, and I'm wondering whether it's a singular effect on one site, or whether others are seeing some sort of relaxation of the filters.

I don't know if I'd assume a relaxation of the filters when it could also be owners fixing their sites. Part 13 was the lucky number for me and I finally found info in this 950 thread that helped me figure out what was wrong with some of my pages.

chelseaareback
msg:3648162 - 7:41 pm on May 12, 2008 (gmt 0)

Don't really see a penalty lift at all from where I look, I'm afraid, leaving our site half 950'd and half not: number one for loads of terms and 950 for lots of others. And there are some fairly big players joining us at the end-of-the-line party right now.

SEOPTI
msg:3651245 - 8:33 pm on May 15, 2008 (gmt 0)

No penalty lift here :(

HoHum
msg:3651432 - 12:18 am on May 16, 2008 (gmt 0)

My main site has basically been bouncing ever since the 950 started about 1.5 years ago. There doesn't appear to be any pattern to the timing, but at worst it's usually up for a few weeks (say 10-15k visitors/day), then down for a week (5k visitors/day). When down, it does stay number one for a lot of searches but drops for the most competitive phrases and lots more longer-tail stuff.
The one thing we always see - always - is that before the drop we have a peak in stats. Not a global maximum in visitors, but certainly a local one. So our stats are like a sawtooth waveform. Boom and bust. Anyone else like this?

dial_d
msg:3651465 - 1:43 am on May 16, 2008 (gmt 0)

OnlyToday,

Has Google re-cached those pages yet? If not, wait until they do. I had a site which suffered from a penalty due to a navigation menu with 120+ links in it.

Once G re-cached those pages, the penalty was lifted one page at a time until all of the pages were re-cached.

HoHum
msg:3651736 - 11:19 am on May 16, 2008 (gmt 0)

Having more than 100 links per page cannot help, but I don't believe (from experience) that reducing links is a long-term solution. A major change like altering the nav appears to trigger a re-evaluation which temporarily sorts the problem. I guess if that is the only problem with your site, though...

Maybe it's not just the links but the anchor text associated with them - reducing links reduces the keyword overloading per page?

Anyway, as I said two posts above, just before we drop we see a significant peak in visitors. This is also associated with an increase in bounce rate and a decrease in time on page - basically because people are wrongly finding our site for searches that aren't relevant to their needs.

We haven't targeted these keywords, there is no deliberate action on our part, but it's almost as if Google turns up the notch and looks to see which websites break and then punishes them. Things then improve as the searches become more targeted, and eventually you come back, and so the cycle continues. It's like a feedback loop where the gain is a little too high and never settles down to a stable level.
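To make the analogy concrete, here's a toy loop (pure Python, every number invented by me): with a modest gain the level settles; with the gain turned up, it just swings between the extremes and never comes to rest.

# Toy feedback loop: each step corrects the "rank signal" toward a target.
# With gain 0.5 it settles; with gain 2.0 it overshoots forever.
# All numbers are invented for illustration.
def simulate(gain, target=1.0, x=0.0, steps=8):
    levels = []
    for _ in range(steps):
        x = x + gain * (target - x)  # correction proportional to the error
        levels.append(round(x, 3))
    return levels

print("gain 0.5:", simulate(0.5))  # approaches 1.0 and stays there
print("gain 2.0:", simulate(2.0))  # bounces between 2.0 and 0.0 forever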

I'm not saying this applies to all sites, but it's just my thoughts on trying to make sense of 18 months of not knowing from one day to another whether our stats will be up or down, irrespective of what changes we make to the site.

[edited by: HoHum at 11:21 am (utc) on May 16, 2008]

OnlyToday
msg:3652216 - 10:02 pm on May 16, 2008 (gmt 0)

OnlyToday,
Has Google re-cached those pages yet? If not, wait until they do. I had a site which suffered from a penalty due to a navigation menu with 120+ links in it.

Once G re-cached those pages, the penalty was lifted one page at a time until all of the pages were re-cached.

After 16 days all the pages should have been cached. Since this penalty nonsense began I've been looking for a better way to make money and I have finally found one. So Google and the Internet have lost my talents, at least for now. My site still does make some money so it will remain, but it won't be improving any more.

Robert Charlton
msg:3652734 - 8:24 pm on May 17, 2008 (gmt 0)

...it's almost as if Google turns up the notch and looks to see which websites break and then punishes them. Things then improve as the searches become more targeted, and eventually you come back, and so the cycle continues. It's like a feedback loop where the gain is a little too high and never settles down to a stable level.

I've been looking for some sort of physical or mathematical analogy for a while now, some sort of metaphor to grab onto, and it had occurred to me that -950 effects suggest a state of unstable equilibrium. The idea of a positive feedback loop is consistent with this.

Just spinning my wheels on this metaphor for a moment (and throwing in a few more ;) )... for a page caught in such a loop, you want to dampen down the oscillations. If you reduce amplitudes to do this, you risk reducing certain desired amplitudes.

Jumping to a physical model to illustrate this... if I had a tall narrow pyramid that was oscillating and in danger of falling over, reducing the height might be one way of stabilizing the structure, but I'd end up with a shorter pyramid. Broadening the base would be a much better way to create a stable yet tall pyramid. It's not always easy, of course, to broaden the base. That can take a lot of labor and moving of material.

And taking it a step further... if you conceptualize the algo as a multi-dimensional system of many pyramids, then, once you reach a certain "height," there are more "bases" that need to be broadened.

It's almost as if Google, the Earth Quake Maker, is challenging the Pyramid Builders, and saying, "OK... if you want your pyramid to be that high in our rankings, then it's got to survive many trials."

I'm not saying this applies to all sites, but it's just my thoughts on trying to make sense of 18 months of not knowing from one day to another whether our stats will be up or down, irrespective of what changes we make to the site.

I've mixed a lot of metaphors here, but I toss them out in response, hoping they might elicit some useful thoughts.

mirrornl
msg:3652762 - 9:38 pm on May 17, 2008 (gmt 0)

Go on Robert!
If necessary we'll pay more!

tedster
msg:3652812 - 10:39 pm on May 17, 2008 (gmt 0)

Some very helpful thinking tools there, Robert. It reminds me of a question someone asked a few months back: "How much PR is needed to support a site of this-many pages?"

steveb
msg:3652922 - 6:33 am on May 18, 2008 (gmt 0)

How many pages for how much PR is a fascinating topic for a thread, but not relevant to 950.

tedster
msg:3653147 - 4:50 pm on May 18, 2008 (gmt 0)

You're right, steveb, sorry for confusing the issue by pointing off-topic.

if you conceptualize the algo as a multi-dimensional system of many pyramids, then, once you reach a certain "height," there are more "bases" that need to be broadened.

This puts my mind back to one early case of the -950 that was fixed within days after receiving a new, high-quality backlink with the penalized phrase in the anchor text. In that case, a new backlink "broadened the base" in Robert's metaphor, and the penalty was removed without any on-site "de-optimization". However, that site then also made some further improvements in the menu area as a preventative measure for the future.

I often remember that early experience with the -950. The site was pretty much "squeaky clean" on the guidelines, so the -950 penalty was a real shocker. I wonder if the evolved -950 of today would even have penalized it in the first place.

potentialgeek
msg:3654452 - 1:11 pm on May 20, 2008 (gmt 0)

The evolved -950... I wonder how this monster looks today.

This morning I noticed that for a search of:

The best red widgetts

I am still number 1.

But for the search of:

The best red widgets

I am nowhere.

I wish more people had spelling issues, lol!

The -950 penalty is for specific words or exact phrases, and unlike the Google search results page, which offers spelling corrections, this penalty doesn't consider them.

I also noticed, speaking of anchor text links, Tedster, my 950d site still gives out good link juice even while it's in the 950 tank.

Anyway, minor anecdotes aside, and on the issue Robert raised...

I'm thinking my site doesn't have enough text content on the page--it doesn't have a broad base of text--and Google thinks it's too thin to justify so many pages. Or the link:text ratio isn't acceptable. It's "tall" on links and "narrow" on text, to use your metaphor.

So I might write more text as the next recovery attempt.
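A crude way to put a number on "tall on links, narrow on text" (a Python sketch; the 0.5 cutoff is an invented guess, not anything Google has published):

# Crude link:text ratio: anchor-text characters vs. all visible text.
# Assumes requests + BeautifulSoup; the 0.5 "link-heavy" cutoff is invented.
import requests
from bs4 import BeautifulSoup

def link_text_ratio(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    anchor_chars = sum(len(a.get_text(strip=True)) for a in soup.find_all("a"))
    total_chars = len(soup.get_text(" ", strip=True))
    return anchor_chars / max(total_chars, 1)

ratio = link_text_ratio("http://www.example.com/keyword1/")
print(f"anchor/total text ratio: {ratio:.2f}",
      "(link-heavy?)" if ratio > 0.5 else "")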

Is there a pattern with the 950d sites with respect to the amount of content? Are the penalized sites mostly "light" on content?

My site attempts to be very broad (it gets found by over 40,000 different keyword search strings per month), but this broad range may have appeared to be spam to Google. We know spammers try to cover a broad range of search terms--as broad as possible, really--"all things to all people."

So perhaps the issue is you need to stop being a jack of all trades, so to speak, and narrow your focus. A master of one. Or two, or a few. Split the site into a number of more focused topics--on other sites.

I'm guessing Google sees some phrases as so similar you're trying to "split hairs" and catch traffic that you can't justify having.

Is anyone else with a "broad" site getting zapped with the 950?

I think I may need to add more "depth" to balance out the "breadth."

Another thing: I'm not yet convinced that Google separates internal links correctly in its -950 penalty. It may see header links as unjustified repetitive anchor text trying to game the algo when you're in a site directory where the header links are "duplicates" of the directory menu.

e.g.:

Home Red White Blue

Red
A
B
C
D
E

That "Red" duplicate is a potential problem Google should be able to recognize as just basic header nav, but maybe it doesn't.

So I could remove all the headers off 2,000 pages and have each directory linked from the home page. (I already removed all footers off 2,000 pages months ago.)

One basic navigation structure that I use on other sites, which Google accepts and I think really likes, is collapsible menus.

Example

example.com/a/

Home
A
1
2
3
4
5
6
B
C
D
E
F

and

example.com/b/

Home
A
B
1
2
3
4
5
6
C
D
E
F

So every main menu is linked site-wide, but only the parts of a directory are visible when you're at that part of the site. (No overlapping keywords in the site-wide menu.)

This is of course the navigation format Google uses in Google Analytics. It also seems to be the most natural and logical navigation structure, and nothing about it seems spammy. It's probably fair to call it the industry standard. Some of us, though, built our 950d sites long before we knew the standards.
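The logic is simple enough to generate server-side: link every top-level section site-wide, but only expand the children of the section the visitor is in. A sketch (Python emitting plain HTML; the section names and URLs are hypothetical):

# Sketch: render a collapsible menu that expands only the current section.
# Section names and URLs are hypothetical.
SECTIONS = {"a": ["1", "2", "3"], "b": ["1", "2", "3"], "c": []}

def menu_html(current):
    lines = ["<ul>", '<li><a href="/">Home</a></li>']
    for section, children in SECTIONS.items():
        lines.append(f'<li><a href="/{section}/">{section.upper()}</a>')
        if section == current and children:  # expand only where we are
            lines.append("<ul>")
            lines += [f'<li><a href="/{section}/{c}.html">{c}</a></li>'
                      for c in children]
            lines.append("</ul>")
        lines.append("</li>")
    lines.append("</ul>")
    return "\n".join(lines)

print(menu_html("a"))  # on /a/ pages, only A's submenu is rendered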

p/g

tedster
msg:3654486 - 1:36 pm on May 20, 2008 (gmt 0)

I also noticed, speaking of anchor text links, Tedster, my 950d site still gives out good link juice even while it's in the 950 tank.

Thanks for that. If I'd been asked to predict, I would have guessed that link juice would be fine - because the -950 appears to be a "last stage" re-ranking of the initial search results. But it's good to have the confirmation.

...the link:text ratio isn't acceptable. It's "tall" on links and "narrow" on text, to use your metaphor.

Although it's not a universal trait among all -950 examples I've seen, the long menu is very, very common. I think that anchor text is weighted much more heavily, and many links on every page give much more opportunity to cross the "over-optimization" threshold.

It is rare for me to see a tightly structured IA and menu system catch the -950 blues, but it has happened at least once when keywords are repeated many times in both menu labels and backlink anchor text. In that case, the -950 hit just one search phrase - the biggest one, of course.

Even if the "long menu" isn't a visible laundry list down the left side, trouble can still be had from an over-stuffed "mouseover" menu. Even if a full-blown -950 isn't triggered, repeating all that anchor text on every page can be problematic. It muddies the semantic signals that Google is looking for, I think.

misterjinx
msg:3654906 - 10:48 pm on May 20, 2008 (gmt 0)

Two notes:
1) A -950 position may not always be a penalty.
Do you know the so-called "sine wave" effect in some SERPs?

2) If you have many links in the footer, it seems you can receive a penalization under Google's "boilerplate" patent.
After removing the footer links, your site will return to the SERPs in a few days.

[edited by: tedster at 10:57 pm (utc) on May 20, 2008]

tedster
msg:3654925 - 11:06 pm on May 20, 2008 (gmt 0)

A -950 position may not always be a penalty.

Certainly, but we have been discussing URLs that previously ranked at or near the top. This summary [webmasterworld.com] and Part 1 [webmasterworld.com] of this topic from over a year ago both mention that essential factor.

misterjinx
msg:3654944 - 11:16 pm on May 20, 2008 (gmt 0)

In fact, as you (and others) could read in the two removed URLs, the "sine wave" effect affects results that were performing very well in the SERPs.

The so-called "sine wave" effect means some pages (not the whole site!) will "fluctuate" towards a lower ranking position until settling in the last 30-40 positions, near 1000.

The site sits at -950 or thereabouts only for some keywords, usually the long-tail terms. For stronger keywords your page won't fluctuate.

One last note: in the better positions, instead of your site, you'll find directories or sites linking to your page, duplicate sites, and sites of low or non-relevant quality.

Robert Charlton
msg:3655081 - 7:03 am on May 21, 2008 (gmt 0)

More thinking out loud....

My metaphors of effectively broadening or narrowing a "base" are attempting to address the apparently paradoxical reactions most of us are seeing when we adjust ranking parameters on affected pages. These parameters can be onpage or off-page, onsite or off-site... and often when we change them, we see movement in the direction opposite to the one expected.

Though offsite confirmation is "broader" than onsite confirmation, Google's linking patents do indicate how some (offsite) backlinks might narrow certain bases... say by too many close anchor text matches, too "spikey" a link build, too many links from the same little network of sites. Conceivably, the link that pulled out the page tedster describes might have backfired... i.e., a "backlink with the penalized phrase in the anchor text" might not have been the right thing... but I'm only guessing at that. It may be that a good quality link with the penalized phrase in the anchor text is what's needed in just about every case. As I consider the affected pages I'm seeing, linking is their obvious weakness.

In working with onsite elements, onpage optimization and nav link anchor text, the onsite base is necessarily narrow in some dimensions, and is more likely to produce paradoxical reactions to our optimization efforts. Although a site may have its own trust factors to work with, there is no trusted outside confirming opinion on links. You are in effect recommending yourself.

I noted last week (5/13) in the May serps thread, in response to a comment about the observed down-weighting of internal linking, that I was seeing "not just a down-weighting of internal linking, but in some cases almost a reaction against it...."

...It almost appears to be a phrase-related penalization... If your internal linking is too precise on a phrase you want boosted and your target page is on the brink of overoptimization for that phrase, then the internal links appear to backfire and actually push the target page down for longer phrases which contain those target keywords included in your internal anchor text.

In particular, what I think I've been observing is that the internal linking would boost the page to above a positive feedback threshold, and then instability would set in and the cycle of reassessment would start over. It may also be, though, that these were simply pages whose time had come, as Google is moving from looking at more competitive phrases to less competitive phrases.

The so-called "sine wave" effect means some pages (not the whole site!) will "fluctuate" towards a lower ranking position until settling in the last 30-40 positions, near 1000.

I should add, btw, that, together with the -950 position, I'm lumping in the up and down cycling we're seeing in general, and I'm somewhat agreeing with misterjinx's comments here. I haven't read what he's referring to, but there is a cycling that drifts down or up, depending on which way the page is moving, on the same pages for which I've seen the -950.

I've been watching for over two years now some pages where some extremely competitive phrases have cycled all the way down to -950 and then risen back up... generally more slowly, but occasionally making a giant leap up... and then the cycling starts over again. On these same pages, I'm now seeing some less competitive but related phrases starting the same patterns. Occasionally the pages have disappeared for the less competitive phrases, but so far they've always come back. I haven't seen these go all the way down to -950, but the patterns I'm seeing have been so similar that I feel the same kind of algo analysis is at work... with some extra goodies, probably Universal Search normalization, thrown in more recently.

HoHum
msg:3655107 - 8:08 am on May 21, 2008 (gmt 0)

Well, it could certainly be a poorly constructed sine wave made out of two points - high and low. The pages (for me) either rank in the top 5 or the last 5 of the returned results.

[rant] Been hit again. Grrr there are only two sources of this info in the UK and we are one of them. [/rant]

To continue with the thinking out loud - Google now has a lot of info on sites, about visitor behaviour in Analytics (which we can see in the benchmarking beta) and about content makeup / link profiles (bots). The bases Robert is talking about could be viewed as a multi-dimensional "shape" of a website (though don't try and visualise it). Depending on the niche or search term, Google expects the returned sites to have a certain average profile with standard deviations about it.

So it appears you tick lots of boxes for many dimensions and fulfill the relevance and trust criteria, but something about the site is odd with respect to the mainstream profile. So, maybe, just maybe, the algo gives the benefit of the doubt (in some cases), lets you ride high, monitors user behaviour (analytics, back-button analysis) and feeds that info back into the calculation. This is negative feedback, in that it is trying to stabilise the SERPs position. Maybe that could give the sine wave effect that certain sites see.
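In toy form (pure Python, with every feature and number invented by me), that "profile with standard deviations" idea might look like:

# Toy profile check: z-score a site's features against an assumed niche
# average. The features and all numbers are invented for illustration.
NICHE = {
    "links_per_page": (60.0, 20.0),    # (niche mean, std dev) - invented
    "words_per_page": (400.0, 150.0),
    "bounce_rate":    (0.45, 0.10),
}
SITE = {"links_per_page": 140.0, "words_per_page": 250.0, "bounce_rate": 0.70}

for feature, (mean, sd) in NICHE.items():
    z = (SITE[feature] - mean) / sd
    note = "odd vs. the mainstream profile" if abs(z) > 2 else "within range"
    print(f"{feature}: z = {z:+.1f} ({note})")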

The problem for my site is that we never stabilise! We are either too high or too low. What triggers a re-evaluation from either end? We have seen many cases where we have gone from #1 to #950, and then #950 to #1, without making ANY changes to the site - why? Because we are stumped as to what to do.

steveb
msg:3655844 - 10:16 pm on May 21, 2008 (gmt 0)

"The -950 penalty is for specific words or exact phrases and unlike the Google Search Results Page which offers spelling corrections this penalty doesn't consider them."

It's interesting that your page would react this way. I have a page that now cannot rank for anything. I could change all the text to be about goat's milk, and link to it with text about Gettysburg, and it wouldn't rank. The URL is poison because it USED TO BE ABOUT the poison phrase.

SEOPTI
msg:3655933 - 12:54 am on May 22, 2008 (gmt 0)

I often crank out duplicates of sites in the -950 (slightly rewritten). It's easier to spam Google with duplicates than to try to heal this -950 stuff.

OnlyToday
msg:3655983 - 2:59 am on May 22, 2008 (gmt 0)

...to try to heal this -950 stuff.

I do feel that my goodwill has been unfairly abused by Google.

joost
msg:3657212 - 10:25 am on May 23, 2008 (gmt 0)

Going back to the 100+ links/page part: on my front page I have a drop-down box (the select itself is plain HTML, though the "go" button needs JavaScript) with links to all my 400+ pages, in this manner:
<form>
  <select name="toclist">
    <option value="blue.html">Blue</option>
    <option value="red.html">Red</option>
    <option value="white.html">White</option>
    <option value="black.html">Black</option>
  </select>
  <!-- jumps to the URL held in the selected option's value -->
  <input type="button" name="tocbtn" value="go"
         onclick="location.href=this.form.toclist.options[this.form.toclist.selectedIndex].value">
</form>

The greatest thing for my visitors, but does Google read that as 100+ links, and will it give a penalty for it?

My site does have the penalty, probably for several reasons, and I'm cleaning it up with the help of the discussion here.

Should I also remove this drop down box?

HoHum
msg:3657256 - 11:20 am on May 23, 2008 (gmt 0)

I would certainly consider splitting the list up, just from a user-accessibility point of view. 400 links in one drop-down must be daunting to a user clicking on it!

I don't know whether Google can read that "link". I've seen poorly crawled sites with this type of link. You could test it by creating a page that's only accessible via this drop-down.
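For what it's worth, a parser that only collects <a href> elements sees zero links in that markup; the option values only surface if a crawler deliberately looks for them. A quick illustration (Python with BeautifulSoup assumed):

# A naive crawler collecting only <a href> finds no links in the drop-down;
# the <option> values appear only if it looks for them specifically.
from bs4 import BeautifulSoup

HTML = """<form><select name="toclist">
<option value="blue.html">Blue</option>
<option value="red.html">Red</option>
</select></form>"""

soup = BeautifulSoup(HTML, "html.parser")
print("anchor links: ", [a["href"] for a in soup.find_all("a", href=True)])
print("option values:", [o["value"] for o in soup.find_all("option", value=True)])
# -> anchor links: []   option values: ['blue.html', 'red.html']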

I create my short drop-down list (<10 items) of related articles after the page has loaded, using Ajax - mostly because I don't want to "pollute" the top of my content with links.

joost
msg:3657290 - 11:55 am on May 23, 2008 (gmt 0)

I'm sorry, I was unclear: I actually have several of these boxes on the page, so user-friendliness is not the problem. Nor is indexing the problem, because all pages are reachable from the home page in two steps apart from the drop-down boxes. The problem is: does Google also see the links in a drop-down box as links, and will it penalize the page for having more than 100 links on it?

errorsamac
msg:3657331 - 12:51 pm on May 23, 2008 (gmt 0)

It's been a little while since I've posted in the 950 thread, but I just noticed one of the larger sites I run got hit, and I would like to confirm that this is a 950 issue and not a different penalty.

Domain: www.example.com / PR5 / Lots of authority links from sites like nytimes, cnn, cnet, etc.

The page that I expect to rank is the main index (www.example.com/). When doing a Google search for "example", I am nowhere to be found (not listed at all in the first 1000 results, even with omitted results included). A search for "example 2008", and the site is the first result. Also, other pages on the domain rank as expected; just the main index page is not showing up in the results for the domain name (which is also the keyword). A search for "example.com" brings up the domain as the #1 result, as expected.

The content of the site has not changed since January of 2008 (it's a static/seasonal site). Does this sound like a 950 penalty, or is it something else? The site got hit last year around this time, except a search for "example" made the domain show up as the #10 result instead of #1, and all of the other pages on the domain were way back in the SERPs. This is basically the opposite (except the main index isn't found anywhere), so I am just wondering if it's the same penalty or what.

Edit: Also, a search for a unique sentence on the main page (which does not contain the keyword at all) does not return the main page (www.example.com/).
