Google's 950 Penalty - Part 11
Marcia
msg:3401658
 4:22 am on Jul 23, 2007 (gmt 0)

< continued from: [webmasterworld.com...] >
< related threads: -950 Quick Summary [webmasterworld.com] -- -950 Part One [webmasterworld.com] >

Just saw one 950+ and it does my heart good to see it.

User-agent: *
Disallow: /theirlinkpage.htm

No, I'm not saying that's necessarily why, but it would serve them right if it was, for playing dirty like that on a page that's supposed to have reciprocal links with people exchanging fair and square in good faith.
======================================================
Added:

And another 950+, the last site in the pack. Flash-only (and not even nice Flash), with some stuff in H1 and H2 elements and one outbound link, all marked class="visible":

<style>
/* class is named "visible" - but it actually hides the text */
.visible{
visibility:hidden;
}
</style>
=========================================================
Another, way down at the bottom, is an interior site page that the homepage 302s to, and it isn't at all relevant for the search term - it must have IBLs with the anchor text (not worth the time to check).

Yet another must also have anchor text IBLs (also not worth the time to check) and simply isn't anywhere near properly optimized for the phrase.

So that's four:

1. Sneaky
2. Spam
3. Sloppy webmastering
4. Substandard SEO

No mysteries in those four, nothing cryptic or complicated like some of the other 950+ phenomena, but it's interesting to see that there are "ordinary" reasons for sites/pages to be 950+ that simple good practices and easy fixes could take care of.

The question does arise, though, whether the first two are hand penalties or whether something's been picked up algorithmically - in one case unnatural linking, and in the other, CSS spamming.


 

kdobson99
msg:3446026
 4:35 am on Sep 10, 2007 (gmt 0)

In my niche I have noticed that 13 out of 15 sites that I can clearly identify as being hit by -950 for a geo (state) search term (state + keyword) all have one thing in common...

13 of the 15 have sitewide links to the pages for each state in the footer or bottom-right navigation column. Some have anchors like "State Keyword" while others have only the state name in the anchor. It affects them the same.

When looking at the top 30 results in the niche, only one has these links completely sitewide, and its stranglehold on the number 1 spot is so firm that a nuclear bomb couldn't move it.

Three other sites in the top 30 appear to have the same types of geo-targeted links, but these links are different in two ways. First, they sit more in the center content section of the page, and second, the sites cover multiple, slightly related niches. Although they appear to have the same kind of links, when the topic of a page changes from topic 1 to topic 2... so do the links. In other words, on the doctors pages the links point to "State Doctor" pages, while on the hospitals pages they point to "state hospitals" pages (again, I'm not in the medical niche).

km1974
msg:3446688
 8:17 pm on Sep 10, 2007 (gmt 0)

The -950 penalty has been around for ages. I am affected by it, and I am not happy!

carlitos
msg:3446721
 8:54 pm on Sep 10, 2007 (gmt 0)

Googlers never cease to surprise me with their inventive ways of tackling hot issues like the 950 penalty. I suppose the 20% of free time they get to be "creative at work" is very dangerous.

So what's new, you wonder?

Well, in order to make the 950 penalty less obvious, these people have now randomized it a bit: sometimes it lands in the 600s, other times around the 500s, 200s, or 50s.

It's not that the algo has been optimized; it's just that they have added a little bit of a random effect so it doesn't look so obvious that they have brutally obliterated decent, well-kept, content-rich, unique sites. That's the way these guys fix hot issues, just like Windows: patch after patch after patch.

I'm telling you: Google resembles more and more our old dictator, you know who.

"Don't be evil"? C'mon guys, what do you take us for?

tedster
msg:3446867
 12:16 am on Sep 11, 2007 (gmt 0)

I've long suspected that other affected URLs might be buried not quite so deep as the last page of results. The "penalty" mechanism (to me at least) appears to be a re-ranking of the preliminary search results, done not by subtracting a number of positions, but rather by applying a fractional multiplier to the original relevance score. The fractional multiplier gets applied when some criterion they are measuring goes beyond a threshold.

So yes, I've often suspected that various multipliers might be applied. The original relevance score could be multiplied by .99, or .71, or .25, or anything. In fact, the fractional multipliers could even be applied to just one partial factor that is used as part of the final relevance score - deeper in the "recipe", if you will.
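Read as an algorithm, that theory might look something like this minimal sketch (the scores, signal names, and threshold are invented for illustration; nothing here is Google's actual recipe):

# Hypothetical sketch of the re-ranking theory above: a fractional
# multiplier dents the original relevance score once some measured
# criterion crosses a threshold. All values are purely illustrative.
def rerank(results, threshold=0.8, multiplier=0.25):
    """results: list of (url, relevance_score, suspect_signal) tuples."""
    adjusted = []
    for url, score, suspect_signal in results:
        if suspect_signal > threshold:
            score *= multiplier  # fractional demotion, not a fixed drop of N spots
        adjusted.append((url, score))
    # Re-sort by the adjusted score; a damped page can land anywhere
    # from a few spots down to the very end of the results.
    return sorted(adjusted, key=lambda pair: pair[1], reverse=True)

results = [("good.example", 0.92, 0.10), ("spammy.example", 0.90, 0.95)]
print(rerank(results))  # spammy.example falls from a near-tie to 0.225

With a multiplier of .99 a page barely moves; with .25 it falls off a cliff, which would fit the very different depths people report.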

It's as hard as can be to make sense of it all.

gehrlekrona
msg:3446886
 12:40 am on Sep 11, 2007 (gmt 0)

Carlitos,

I wasn't sure about the 950 penalty because my site ends up in the 800s or so, but your theory is probably right. To hide the obvious, they randomize it instead so people get confused and stop looking for 950 penalties. All we know is that we disappear, and if you are not in the first 3-4 pages, then you are a nobody and might as well not be there at all.

One thing I REALLY do not understand is that you can be on top one day, then gone the next, only to come back for a couple of days and then disappear again. You would think that after coming back you would stay there, but it seems like they are just testing and testing and testing.... and nobody checks the result. Does "lab rat" come to mind?


carlitos
msg:3447085
 7:15 am on Sep 11, 2007 (gmt 0)

Exactly gehrlekrona, testing with other people's hard work.

Interesting thinking, tedster. A multiplier to send you to hell... well, depending on the day. G is so full of rubbish lately.

steveb
msg:3447207
 10:43 am on Sep 11, 2007 (gmt 0)

That's a different penalty. Pages that drop several hundred places for some terms but maybe ten spots for others are not being hit with a 950 penalty, so it is wrong to try to combine the two. There is always a tendency here to think "one big thing" is responsible for everything, and it makes all discussion useless.

The 950 penalty is not the -30 penalty and it is not the "kneecapping" penalty that guts the rankings of a page, but not in a consistent way.

Miamacs
msg:3447245
 12:25 pm on Sep 11, 2007 (gmt 0)

Yeah.

But no.

Pages hit with -950 actually do show up sometimes at different places within the SERPs ( -1/4, -1/2, -3/4 etc.; in other words, anything below -140 can be a suspect ). This indicates movement in their ranking within the still-penalized ranges. -950 is both the first and the last stage before they come out of the twilight zone.

...

Which, in combination with the fact steveb points out - that there ARE penalties / filters / whatever that'll make your pages drop to similar positions ( i.e. -30, -50, -80, -120, -180 ) - makes it quite hard to identify in its advanced stages. But if you track the rankings it becomes obvious. Based on their movement you can guess, with good accuracy, which of those is -950.

There's no such thing as 240 on Thursday, 886 on Friday, 459 the next Monday, and 213 the next Thursday morning. That's a threshold/your profile being adjusted in a volatile area. But if the original positions were top 20, and the rank history shows the -950 ( ok, 'end of results' ) ranks as well... those in between are but stages of the same sickness.
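A rough sketch of that tracking heuristic (the cutoffs and the 1,000-result assumption are mine, not anything published):

# Illustrative heuristic for the distinction above: -950 pages
# oscillate within the penalized tail and periodically hit the very
# end of the results, while threshold volatility scatters ranks with
# no end-of-results visits. All cutoffs here are guesses.
def looks_like_950(rank_history, total_results=1000, tail_cutoff=140):
    tail_start = total_results - tail_cutoff              # e.g. rank 860+
    hits_end = any(r >= total_results - 10 for r in rank_history)
    tail_share = sum(r >= tail_start for r in rank_history) / len(rank_history)
    return hits_end and tail_share > 0.5

print(looks_like_950([950, 998, 870, 940]))   # True: stuck in the tail
print(looks_like_950([240, 886, 459, 213]))   # False: volatile, not -950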

...

[rant coming up]

...

Having read many posts and having seen many sites, there are several things I'm convinced of here - and don't get me wrong.

- This is 100% a relevancy-based filter. The end. I've yet to see ANY examples that'd prove otherwise. Sometimes the problems arise from completely unrelated errors and mistakes, but in the end, this filter is based on (ir)relevancy, and only that. ( unrelated keywords, unrecognized themes, irrelevant anchor text in inbounds, and/or from irrelevant sources, irrelevant or too-generic titles/internal anchor text, no exact match for phrases, no variety in phrases, closely monitored phrases, too competitive... etc. etc. etc. )

- This -950 is not *really* a penalty or a filter. I'm only addressing it as such because that makes it easier to relate to. It's but the devaluation of links, and of the relevance (PR?) that came with them.

- Right now collateral damage from the -950 penalty/filter is minimal. Most of the sites I knew to be completely legit, authority sites... are out. Some of them did nothing. Some of them had to fix things first. Google too had to adjust the filter, and they did so.

- Most of the sites that have been caught since then have an unhealthy, unnatural, unbalanced, sometimes downright spammy link profile. Meaning: icky sources and/or simplistic anchor text.

- Add a navigation with repetitive and/or colliding, irrelevant, competitive, spammy anchor text.

...

[rant continues]

- People have a hard time grasping what 'relevancy' means.

- Or are too lazy to read the posts here. No wonder; there have been 11 threads.

I can only think of one good source for definitions here, which will do a lot more for most webmasters than reading through all 11 threads at this point ( except for those with legit sites tripping the 'unknown theme' version of the filter ). Look, the -950 filter has evolved to a point where I can guess in an instant what it will do, for it's now pretty close to how I think of websites.

...

And it inches closer and closer to these classic, albeit vague, guidelines:

- Have other relevant sites link to yours.

(...)

- Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.

(...)

- Create a useful, information-rich site, and write pages that clearly and accurately describe your content.

- Think about the words users would type to find your pages, and make sure that your site actually includes those words within it.

(...)

- Check for broken links and correct HTML.

(...)

- Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you'd feel comfortable explaining what you've done to a website that competes with you. Another useful test is to ask, "Does this help my users? Would I do this if search engines didn't exist?"

(...)

- Don't participate in link schemes designed to increase your site's ranking or PageRank. In particular, avoid links to web spammers or "bad neighborhoods" on the web, as your own ranking may be affected adversely by those links.

...

New golden rule for webmasters.

Listen, and honestly. Don't get me wrong.
I guess Google is a business after all: first they bluff their way ahead, and then slowly live up to their own standards:

If it can be identified by a human, it can be identified by Google.
If not today, then tomorrow.

( yeah, um, btw hidden text works. Period. )

Google fortune telling:
if it goes on like this, then...

- Most 'for the heck (/money) of it' blog posts, and even entire generic ( non-themed ) blogs will fade from the SERPs in the coming months if you ask me.

- And so will the sites that relied on links from these sources. At least for popular, competitive phrases, for which, again, honestly, there are better sources than these.

- Relevant anchor text links from irrelevant sources will be just as bad as irrelevant links from irrelevant sources. I'm still hoping that Google will be ( or already is ) capable of telling when the source and the target are relevant, even though the anchor text is dumb.

- There's no such thing as a quality link without relevance.

- Get a bunch of links from a trusted/high-PR source saying something other than what you are and you're finished. Unless you switch your company profile.

- I wouldn't sit too comfortably if all my links were from generic, unknown link directories, either.

- A THEME or themes a site is relevant for will become a major factor. Oh... well, sorry, it has been a factor for ages, and a major one for more than half a year now - wake up.

- If you can't grasp what's relevant and what's not in your targeted language... meaning what is competitive, what the monitored words are, i.e. words to include and words to *evade*... you won't be able to do SEO anymore.

...

One last thing.

-950 isn't a penalty.
It's link devaluation.

It only takes away ranks that Google didn't want to give in the first place ( +/- the collateral damage ). If you have quality content up and no technical or accessibility issues, all you need is a few links with real quality ( as opposed to the above examples ) to get you going again. So don't panic.

And believe me, it's not in the numbers.

...

wasn't this actually two posts and a new topic?


Biggus_D
msg:3456773
 6:26 am on Sep 21, 2007 (gmt 0)

So our sites suck, have no trust, are badly designed (SPAM, SPAM, SPAM), and, in a few words, at the end of the day we deserve it.

Well, then how on earth has one of our sites improved its Google referrals twentyfold this week (yeah, 20 times higher)?

Did we win the trust lottery this week? (No changes have been made to this site since this 950 thing started several months ago.)

I can't take this thing seriously. I want a Google Ombudsman.

simonmc
msg:3456830
 9:38 am on Sep 21, 2007 (gmt 0)

Google creates most of the spam themselves.

1. Google trumpets: blogs are the new link power.
- Spammers hound the hell out of blogs.

2. Google trumpets: social bookmarks are the new link power.
- Spammers hound the hell out of social bookmarks.

3. Google trumpets... whatever they trumpet next.
- Spammers will hound the hell out of it.

Google is responsible for devaluing the usability of these types of sites by driving the spammers to them.

Who cares though... Google shares are UP UP UP :)

Simon

Miamacs
msg:3456902
 12:09 pm on Sep 21, 2007 (gmt 0)

Sure.

- Right now collateral damage from the -950 penalty/filter is minimal. Most of the sites I knew to be completely legit, authority sites... are out. Some of them did nothing. Some of them had to fix things first. Google too had to adjust the filter, and they did so.

...

For those who feel their hat is so white it glows in the dark / their site uses a semantically and thematically relevant navigation; clean, accessible pages; no irrelevant stuff; no affiliate links; no paid advertisements off-topic to or from your domain; no irrelevant blog entries on or about the page; no excessive use of repeated words; no venturing into competitive areas you've never had a link for; and no HTML, server, or accessibility errors whatsoever - in other words:

If you feel your site is legit, well designed, non-spammy, the top 3 suspects are:

1. Irrelevant, low-value inbound links ( a number of links that got devalued this year, one after another ).
2. Links where the source is perfect, but the anchor text isn't recognized as fitting your theme. "Almost good" links that don't pass relevance.
3. Links in your internal nav that use anchor text not recognized by Google as fitting your theme.

You say you got out, doing nothing?

That's Google adjusting stuff, their understanding of your theme. See point #2. They have been tinkering with thresholds and phrase sets all year, testing, filtering, with borderline sites jumping up and down the SERPs every four days.

A quick note about trust: -950 affects sites that are referred from trusted sources, but since their anchor text is problematic /and-or/ the source isn't relevant, it's not yet decided what they should be trusted for. This is a relevancy problem. If you didn't have any trust, you wouldn't be #950... you'd be *OUT* of the index.

...

20 times higher

No such thing. You're either still affected or out. Rankings for affected pages don't move in a linear fashion. They're all over the place. You can't even say higher is better. #340 is actually LOWER than #950. For example, did you know that the last result is in fact the first, but moved to the end of the queue? A few more good links and you're #1 from there. #60 is worth far less than #950.
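Taken literally, that inverted tail could be toy-modeled like this (an assumption-laden illustration of the idea, not a description of Google's internals):

# Toy model of the claim above: within the penalized tail the order
# inverts, so the strongest flagged page is pushed deepest and the
# literal last result was originally the best. Purely illustrative.
def demote_flagged(serp, flagged):
    clean = [url for url in serp if url not in flagged]
    hit = [url for url in serp if url in flagged]
    return clean + hit[::-1]  # strongest flagged page lands last

print(demote_flagged(["A", "B", "C", "D", "E"], {"A", "C"}))
# ['B', 'D', 'E', 'C', 'A'] - the original #1 ("A") now sits dead last,
# which is why #950 can be "worth more" than #340 or #60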

...

But back on the "Our site is (not) spam" remark.

Don't take things personally.
But with all the sites I saw... I tend to be cautious.
Sorry if my post sounds as if everyone whose site is hit by the -950 filter is trying to spam. That's not the case.

BUT.

I'm convinced - make that "my experience is" - that most ( 70-90% ) of the sites that remain in the twilight zone are sites you wouldn't want to land on as a user. Not malware, not spam, but... not value either.

This might not apply to you. See the above reasons for collateral.

...

Most of the sites sitting at -950 are either designed, worded, or linked the wrong way; have faulty code, navigation, duplicate content, or other major problems; or simply used a marketing scheme ( paid blog posts, links bought for PR, links from high-profile but irrelevant domains, spammy anchor text ) that resulted in a faulty link profile. Most such sites were promoted that way because a.) it was the then-current trend in SEO, *or* b.) the content couldn't be marketed otherwise. Stuff drops out, links get devalued; whichever the case, something broke their relevance matrix/scores in half.

...

Again, it's not like *all* of this applies to you.
But right now one or more of them apply to most of the sites I see down there.

...

And Google having a lot of spam?
Sure, maybe, yeah, but this thread is about -950.

Did you see the posts...? That they actually plan to roll out new infrastructure, because a hole in their system got badly exploited. I'll see how that change plays out, and if I feel my opinion would matter, I'll try to get my thoughts together.

BUT this -950 stuff isn't affected by their current struggle to get this new Google out of beta-test mode... No sites I monitor to track changes in the activity of this filter show any irregular movement.

Some are in, some are out. Business as usual.
So if you're still affected, refer to my prev. few posts for clues on what to do.

But one thing always helps... get more quality, on-topic, relevant links.

bwnbwn
msg:3457151
 2:06 pm on Sep 21, 2007 (gmt 0)

What I find really weird about this penalty is that it seems to affect only certain one-word search terms and not two-word or longer terms.

I have been making changes in stages to see what might affect this penalty the most, and it does look like I am crawling out of it.

Was stuck on the last page for a single term I had ranked for on the 1st page for 3 years.

Started making the changes two weeks ago and have since crawled to the top of page 80, moving almost daily in the right direction. After the page shows a new cached date, I make another change.

I will see where this goes and post my changes as soon as I see this is working. I made another change the other day that hasn't been indexed yet, or at least isn't showing that it has.

I will add that I was sinking for some two-word terms I ranked for, but these changes have helped them come back as well.

I know my main issue is duplicate content, as this is an ecommerce site with different flavors of products but the same descriptions. I see websites that have moved up for the one-word search terms whose titles are all exactly the same, and I have more pages indexed than they do, so I feel I can rule out the supplemental index as the culprit in the 950.

The supplemental index, I feel, is reducing the long-tail searches we all need to get the targeted traffic that converts. I have been rewriting these pages to get them out as well, but when you have a couple thousand it takes time.

Miamacs
msg:3457167
 2:21 pm on Sep 21, 2007 (gmt 0)

What I find really weird about this penalty is that it seems to affect only certain one-word search terms and not two-word or longer terms.

You mean... In your case. Right?

Most sites affected see two, three and four word phrases sent to the back all the same. Depends on what you lack relevance for.

For your site it's probably the nav/title combination, possibly a dupe-content kind of problem. And/or you have a lot of good inbounds, but none, or too few, that use nothing but the targeted keyword as the anchor. ( Single-word, exact-match quality links. )

bwnbwn
msg:3457211
 2:57 pm on Sep 21, 2007 (gmt 0)

Miamacs
You could be correct, but a number of sites that have taken over the top SERPs for the one-word search term I am referring to have zero - a big nothing - as far as backlinks go.

I ran a link check: one shows 1 link, to a junk site, in Google, and only 200 in Yahoo. I am showing 35 in G and 3,500 in Yahoo.

I am really wondering what the cause is for this site to rise. It has the same title tags for every page - 3,500 of them, with only 191 pages not in supplemental. I ran the mysite.com/* check.

I can find nothing that would suggest "trust" here. The behavior in the SERPs is really strange as far as how or why. It sometimes feels like a crap shoot and we are all rolling the dice.

kdobson99
msg:3457309
 4:21 pm on Sep 21, 2007 (gmt 0)

I was posting feverishly about 10 days ago in this thread. I bounced out and resumed my normal rankings about a week ago. I got a few good-quality links, but it doesn't look like Google had found them before I was let out. The only other thing I did was change the internal linking structure to be more vertical, rather than cross-linking deep internal pages. I also updated my copyright date, as I noticed several 950 sites had old copyright dates in the footer - although I'm pretty sure that wasn't an issue... just something that several of the sites had in common and needed fixing anyway.

Miamacs
msg:3457312
 4:25 pm on Sep 21, 2007 (gmt 0)

As I said, all sites in the -950 area are, in fact, 'trusted' based on their parameters.

-950 is a re-ranking system to battle spam by filtering irrelevancy and other signals of manipulation within these trusted results.

This is a relevancy related problem.

...

Volume doesn't play a role here. It's about the occurrence of phrases, their absence, and their balance. All this in the (link) profile of the same site.

Scale down the numbers to match your competitors' and think in percentages for a moment.

Example ( and just an example, not analysis ): if, within those 3,500 links, 69% of all links had the same anchor text ( which is competitive ) and there weren't enough variants/supportive phrases to send a clear signal that yes, this is natural linking, that'll be a problem. People rarely link to a website with the same generic term 69% of the time ( they do with brand names; they don't with services/products/locations ). You'll end up with an out-of-balance link profile, and your pages will be barred from competing until further notice ( until you get enough variants of your keyphrases in your links to suggest it's not you who's arranging ALL your links ).

Let's say the competitor with only 200 links, as opposed to your 3,500, has a very healthy-looking, natural link profile: about a 25-30% emphasis on its main keyphrase, another 15% for variants, plurals, even typos, 20% using its URL as the anchor text, and a whole bunch of random words related to the topic. If those 200 links are 200 quality links, there's nothing mysterious here. The 3,500-link total probably includes about the same number of sources that *really* count.
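A quick way to eyeball the kind of balance described above (the 30% warning level echoes the illustrative figures in this post, not anything published):

# Tally inbound anchor texts and report each one's share of the total.
# The example profile mirrors the fictitious 69% scenario above.
from collections import Counter

def anchor_profile(anchors):
    counts = Counter(a.lower().strip() for a in anchors)
    total = sum(counts.values())
    return {text: round(100 * n / total, 1) for text, n in counts.most_common()}

links = (["blue widgets"] * 69 + ["widgets"] * 10 +
         ["example.com"] * 12 + ["great article on widgets"] * 9)
profile = anchor_profile(links)
print(profile)  # {'blue widgets': 69.0, 'example.com': 12.0, ...}
if max(profile.values()) > 30:
    print("Top anchor share looks arranged rather than natural.")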

...

This was a fictitious example - your link profile isn't this unbalanced - but even a smaller irregularity can trigger problems.

And your site is trusted, if it wasn't, it wouldn't be at -950, but would drop out instead.

It's relevancy that's suffered a blow.

Co-occurrence is not only about **not including** irrelevant stuff, but also about **including** words and phrases that Google would be looking for on a legit site. If your pages already feature these words, your next target should be analyzing your inbound anchor text. An out-of-balance link profile will send a legit site to the penalty box.

Gavolar
msg:3457334
 4:46 pm on Sep 21, 2007 (gmt 0)

I think Miamacs has hit the nail on the head...

I had a site that had the 950 penalty for about a year. I believed it was because all my links had the same anchor text.

For the past 6 months I have made sure all my new links have different anchor text, in order to adjust the ratio.

My site is now back on the first page.

bwnbwn
msg:3457405
 5:49 pm on Sep 21, 2007 (gmt 0)

After I see that these last changes are cached, I will begin to follow Miamacs's advice, as I have been fairly inconsistent in varying my anchor text.

I do know it is really tough to get a good link in ecommerce to stick without having to be a policeman and review them monthly to see if they have started cheating or hired an SEO firm ( not all of them, I might add - just a few ) that will use nofollow tags, disallow rules in robots.txt, doorway pages, etc., and then I find myself linked to a bad neighborhood.

gehrlekrona
msg:3457417
 6:01 pm on Sep 21, 2007 (gmt 0)

Miamacs and others,

When you are talking about links and anchor text, are you talking about incoming links, internal links or both?

If it is incoming links, then it'll be hard to change the ones you already have out there.

bwnbwn
msg:3457435
 6:19 pm on Sep 21, 2007 (gmt 0)

gehrlekrona
New links are what they are referring to.

steveb
msg:3457686
 12:08 am on Sep 22, 2007 (gmt 0)

"I had a site that had the 950 penalty for about a year. I believed it was because all my links had the same anchor text.
The past 6 months I made sure all my new links had different anchor text in order to adjust the ratio. My site is now back on the first page."

Then you are an exception. Varying anchor text usually gets you a penalty, not the other way around.

Problems tend to occur because pages are too powerful and too relevant for too many things that are related.

A general way to start to address problems is removing synonyms, making anchor text consistent, making shortish titles with no two words related, and otherwise making a page seem about ONE thing. In other words, kill your rankings for everything but a single search.

europeforvisitors
msg:3457733
 1:50 am on Sep 22, 2007 (gmt 0)

Better yet, why not try the "organic" approach instead of trying to second-guess what Google might be doing at any given moment with its algorithms and filters? Follow the Google Webmaster Guidelines, provide Googlebot with digestible "spider food" in the form of descriptive titles and headlines, etc., and don't spend your time endlessly tweaking things like anchor text and keyword density. You might be pleasantly surprised by the results.

tedster
msg:3457751
 2:54 am on Sep 22, 2007 (gmt 0)

europeforvisitors, many people who woke up to find themselves with a -950 situation thought they were doing exactly that. But once bitten, they need something a bit stronger than generic advice about running a clean site to understand what is causing this surprising problem.

The -950 is such a dramatic phenomenon that many around the web sincerely doubted that it even existed when we first started discussing it here. One of the most interesting fixes I've seen is a new backlink with the penalized phrase used as anchor text. Throw that into the mix of approaches to this whole thing - it has worked for some, and quickly, too.

Gavolar
msg:3457778
 3:52 am on Sep 22, 2007 (gmt 0)

Steveb "varying anchor text usually gets you a penalty"

I keep to the same theme... I might add part of the URL, plural/non-plural, mix the keywords up, etc.

europeforvisitors
msg:3457817
 5:02 am on Sep 22, 2007 (gmt 0)

One of the most interesting fixes I've seen is a new backlink with the penalized phrase used as anchor text. Throw that into the mix of approaches to this whole thing - it has worked for some, and quickly, too.

But for how long? And what if it backfires? Seems a bit risky to me. If clean sites are getting a -950 penalty because of collateral damage or some other screw-up at Google, wouldn't it make more sense to leave well enough alone and let Google fix whatever needs to be fixed? (And mightn't it be easier for Google to fix such problems if affected site owners weren't introducing new variables?)

simonmc
msg:3457857
 7:24 am on Sep 22, 2007 (gmt 0)

wouldn't it make more sense to leave well enough alone and let Google fix whatever needs to be fixed?

Some people run their web sites as a business or as a job. That means they may well rely on their site ranking well to put food on the table. ( A foolish business plan indeed... but one that is frequent on this board and many others too. )

So if you were in the situation where you relied on it ( I know you wouldn't be so stupid ), hypothetically speaking, would you just leave it alone?

It is a documented fact that Google can take anywhere from just hours to never to fix problems generated by their engineers. Where between those two time frames does this current issue lie?

Seems to me that the -950 penalty has been discussed here for at least a couple of months. That's a long time with no food for some.

Instead of just offering up the google party line, why not either refrain from the unhelpful comments that just clutter up the topic or give some valuable advice.

steveb
msg:3457858
 7:25 am on Sep 22, 2007 (gmt 0)

"Better yet, "

Let's not go down the nonsense path. The only pages hit with the 950 penalty that I'm concerned with do follow the Google guidelines. That is the whole point. These are strong, genuine, well-constructed pages that are accidentally being hit by a penalty aimed at bogus pages trying to mimic the genuine nature of the pages accidentally penalized.

The only reason to do what I suggested is that Google is screwing up by penalizing very legitimate pages. If Google misinterprets a page following the guidelines as not following them, advice to just follow the guidelines is silly. In the case of these particular pages, the answer is to continue following the guidelines but simplify the pages.

In other words, Google is acting stupidly confused. So make things as simple as possible, which makes it more likely (though guarantees nothing) that Google will not screw up and penalize pages it does not mean to.

(Advice for what to do with spam pages being penalized justifiably is entirely different, and covered at length by others in this thread.)

tigertom
msg:3457897
 10:19 am on Sep 22, 2007 (gmt 0)

Based on what I've read here, I think my pages got -950'd because:

1. I was tweaking them and tweaking them for years; putting keywords in the ALT and link TITLE text, image names, anchor text etc.;

2. Lots of keyword-rich links to other keyword-rich pages on said keyword-rich pages;

3. The above not supported by backlinks with the same keywords in them from other sites.

Maybe: 4. Wide and varying niches on the same domain.

Made drastic changes to the above, split content onto various domains, and the sites are coming back.

The main site was also being eroded, unbeknownst to me, by:

5. Pages going supplemental.

The safe solution to all of the above, it seems to me, is to:

a) Dial down the optimisation
b) Change the navigation
c) Ramp up the on-theme backlinks carefully.

c) is always a good idea anyway. Time well spent.

I don't think the Google algo needs to be the work of mega-brains. All they have to do is to decide what the current footprint of a 'pretender' site is, and filter that. It'll be different, probably very different, from a naturally-popular web site.

Too bad if your site looks similar.

Miamacs
msg:3457937
 12:03 pm on Sep 22, 2007 (gmt 0)

Problems tend to occur because pages are too powerful and too relevant for too many things that are related.

A general way to start to address problems is removing synonyms, making anchor text consistent, making shortish titles with no two words related, and otherwise making a page seem about ONE thing. In other words, kill your rankings for everything but a single search.

...

Steveb, look at me, I'm crying, I hope you're satisfied now! *grin*
I know what you meant, and you're right. ( If a synonym is a word that's not *considered* a synonym, and is monitored as a *separate* competitive keyword by Google, sometimes for no reason, you can't take it for granted that you'd rank for it, and you will need to treat it as a separate, unrelated word. Obviously, you'll need inbounds for your site with it first. Also, n+1 ultra-competitive phrases on the same page are spam, regardless of any other parameters. )

But you have worded this post to sound like the *exact opposite* of what could be considered good advice for *all*.

...

Lemme rephrase it just for fun.
*groan*

...

Problems tend to occur because you think your pages are too powerful and too relevant, and you aim for too many things that are related in your opinion - but to Google they're yet another theme, yet another aspect, yet another business model, which you don't have authority in.

A general way to start addressing problems is removing competitive phrases you are NOT relevant to ( money-sensitive themes are *very* narrow! ), i.e. removing competitive words you don't have a single inbound for; making anchor text consistent; making descriptive titles with no two unrelated words; and otherwise making a page seem about ONE ultra-competitive thing at a time. Or get more inbounds, raise relevance, watch the co-occurrence filters' reaction, and expand only if your pages are relevant. In other words, kill your attempts at rankings for everything but what you are relevant to.

...

Recently I started wondering... why bother posting all the stuff I keep doing research on, again and again - the methods used to get all of these sites out, the readings/understandings of the patents that preceded recent changes, tracking how they were put into practice and how they match up with natural and unnatural phrase occurrence, link profiles, title/thematic/anchor-text relevancy - if people don't really read the posts, don't have time or don't bother to understand ( though I might be explaining it the wrong way too ), and then keep on summarizing everything with posts like... uh... yeah.

...

So, no, please, don't...

Do not follow steveb's advice *word for word* unless you know what it means. One aspect is about co-occurrence, i.e. the presence of one too many ultra-competitive phrases ( high ad competition means monitored in Google organic search too ) AND/OR the use of competitive synonyms in the title/navigation when the page isn't relevant to them in Google's eyes. It will only work if you, for example, have titles like

"Buy widgets, blue widget, iPhone, US flights, Holiday presents, Xmas is coming up, send widgets as presents"

...for every other innermost product page on your site.

"Buy blue widgets, Sell blue widgets"

...on a site relevant only to 'buy' not 'sell'.

"PCs, Personal Computers, Desktops, Laptops, Notebooks"

...on a site that's only relevant to two of these.

...

Natural means using synonyms as well.
No planning, no targeting: just describing everything naturally will include synonyms. Don't go overboard, don't be 'creative'; think about what Google's little AI will 'get'. And if it's a whole other theme ( buy widgets and review widgets are different themes ), DON'T think it's relevant! Check AdWords for clues.

Natural means variety in anchor text.
If you were running stupid, aggressive link campaigns that looked *exactly* like what they were - links arranged by you, targeting keyphrases with all too much precision - you'll need to correct that.

...

...

Wow... I found myself almost posting all the possible scenarios yet again.

No, I think I'll give up.

...

MHes
msg:3457939
 12:15 pm on Sep 22, 2007 (gmt 0)

Got hit in December 2006. Fixed the problems in Feb 2007 and have been stable since March, with top rankings again in a competitive field. While hit, our traffic from Google went from 24,000 per day to 500.

Internal navigation was causing a high 'N' value for target keywords. The N value is the number of related phrases, as per the phrase-based searching patents. N.B. it's the number of distinct phrases; the frequency of any phrase is unimportant.

The patents are at:
[appft1.uspto.gov...]

[appft1.uspto.gov...]

[appft1.uspto.gov...]

[appft1.uspto.gov...]

I still reckon this is the core of 950, and I've yet to see any evidence otherwise. The key is that a high N can cause a page to be treated with suspicion, yet be redeemed by other factors if they exist. I still believe you can cure 950 either by reducing N or by creating redeeming features for that page.
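A bare-bones reading of that N value, for illustration (the phrase list and the idea of a fixed suspicion threshold are stand-ins, not values from the patents):

# Count how many distinct related phrases co-occur on a page.
# Frequency is deliberately ignored, per the note above.
def phrase_count(page_text, related_phrases):
    text = page_text.lower()
    present = {p for p in related_phrases if p.lower() in text}
    return len(present)  # N: distinct phrases, not total occurrences

related = ["blue widgets", "widget repair", "cheap widgets", "widget store"]
page = "Widget repair and cheap widgets from the best widget store."
print(phrase_count(page, related))  # 3 - if N crosses some suspicion
# threshold, redeeming factors would be needed to rank normally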

If some pages on your site are hit by 950, then others will lose ranking ( -230 etc) because the links from the hit pages will be worth a lot less.

Just my two pence worth, and I think it worked for me.

steveb
msg:3458117
 6:06 pm on Sep 22, 2007 (gmt 0)

"But you have worded this post to sound like the *exact opposite* of what could be considered good advice for *all*."

As I've said, I couldn't care less about spammers being caught for the reasons Google is trying to catch them. I'm not concerned with "all". How they get out of the penalty is ENTIRELY different, which is what you keep focusing on, Miamacs.

Likewise how you rewrote what I wrote is 100% wrong for non-spamming pages hit with this penalty.

I can't say this clearly enough: suggesting spammers remove text they are not relevant for may be fine, but I could not care less. With non-spam pages, the 950 penalty hits for terms the pages ARE relevant for, that they ARE relevant for in the eyes of any human, and that they DO have natural anchor text for, internally and from other good sites.

Where Miamacs talks about "get more inbounds, raise relevance..." - that's all 100% irrelevant to the non-spam pages. We already have that covered. It's a problem in no way at all. If you have low-quality spam pages, you likely need to worry about these things, but for the non-spam pages affected, that's just bizarrely strange talk.

Non-spam pages hit are generally, objectively, among the top 10 pages in quality in terms of their multiple relevance areas. Authority non-spam pages about Abe Lincoln and George Washington are at risk if you have quality internal and external linking about both. A way to make it more likely to get out of the penalty is to either split the page in two or get all your anchor text talking about Lincoln, giving up on Washington.

So if you are a spammer, you might improve your situation by following what Miamacs says, but if you have a well-respected site where pages are recognized as authorities by Google and any human for good reason, then you have completely different issues. Miamacs addresses pages that DESERVE to be penalized. What I am talking about, and the reason these threads began, are pages that literally nobody would consider something that should be penalized... solid pages with solid linking, often having a broader coverage of a topic (thus more synonyms).

There really should be two threads, one for the pages penalized correctly, and one for those penalized by mistake.

(One example of the issue... spam pages often have tons of links from non-relevant sources. The top-quality pages also have tons of links from non-relevant sources - because they have been scraped from search pages for MULTIPLE search terms. Obviously Google will sometimes confuse these two phenomena as the same thing, sometimes ranking the crap page well while penalizing the authority page.)
