Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

This 194 message thread spans 7 pages.
Google's 950 Penalty - Part 7

 10:18 pm on Apr 13, 2007 (gmt 0)

< continued from: [webmasterworld.com...] >
< related threads: -950 Quick Summary [webmasterworld.com] -- -950 Part One [webmasterworld.com] >

I read in another thread that you wrote that you have a recip links page. That is probably what is causing your site some grief.

No, it certainly is not. annej's SITE is not having any grief whatsoever. There are simply some individual PAGES that are not ranking for the chosen keywords.

In addition, having reciprocal links (or a recip links page, or even a whole directory with links) is NOT what causes this phenomenon. There are sites with reciprocal link pages and even directories with a percentage of recips that are untouched and have top-notch rankings. And that is a verifiable fact.

Remember, the algo is completely automated with very little human input. You probably need to take a long hard look at who you're linking to and whether they are spamming.

This has nothing whatsoever to do with OBLs and nothing whatsoever to do with link spam.

Remember, Google guidelines state not to have your site link to bad neighborhoods. If one of the sites you are linking to is spamming Google, it can have a drastic effect on your site. Check to see if all the sites you link to are following Google guidelines. If they are not, you might want to drop that particular link.
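Checking every site you link to by hand gets tedious, so here's a throwaway sketch of the first step: pulling the external domains out of a page so each one can be reviewed against the guidelines. The page markup and domain names are made-up examples, not anyone's actual site.

```python
# Hypothetical sketch: list the external domains a page links out to,
# so each can be reviewed by hand against the quality guidelines.
from html.parser import HTMLParser
from urllib.parse import urlparse

class OutboundLinkAudit(HTMLParser):
    def __init__(self, own_domain):
        super().__init__()
        self.own_domain = own_domain
        self.external = set()

    def handle_starttag(self, tag, attrs):
        # Only anchor tags with an href that points off-site count.
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        host = urlparse(href).netloc
        if host and host != self.own_domain:
            self.external.add(host)

page = '<a href="http://example.com/">partner</a> <a href="/about">about</a>'
audit = OutboundLinkAudit("example.org")
audit.feed(page)
print(sorted(audit.external))  # the domains to eyeball manually
```

The relative `/about` link is ignored; only off-site hosts come back for review.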

Linking out to ONE? Did I read that right and/or interpret that correctly? Or am I seeing things? Where in the world did that theory come from?

If a site is SPAMMING by a pattern of linking out to bad neighborhoods, it'll cause a problem with the SITE - not individual content pages that are simply not ranking. This is not the case, not by any means.

I don't know how many times it has to be repeated: please don't accuse anyone hit by this phenomenon of somehow spamming. There's no basis in reality for it, and it causes unnecessary, unjustified stress. Trying to help is always appreciated, but this is serious - it's no place for folks to be tilting at windmills.

[edited by: tedster at 9:16 pm (utc) on Feb. 27, 2008]



 10:34 pm on Apr 13, 2007 (gmt 0)

There are some pages that are getting "clustered out" because of semantic redundancy. Not duplicate content, redundancy. There's a big difference between the two.

[edited for spelling]

[edited by: Marcia at 10:36 pm (utc) on April 13, 2007]


 10:42 pm on Apr 13, 2007 (gmt 0)


Annej wrote that in this thread:


As far as the links go, straight from the webmaster guidelines, second sentence:

Don't participate in link schemes designed to increase your site's ranking or Page Rank. ****** In particular, avoid links to web spammers or "bad neighborhoods" on the web, as your own ranking may be affected adversely by those links.*****


 10:59 pm on Apr 13, 2007 (gmt 0)

BTW, Annej, I am NOT saying you're spamming! One of the sites that you might have a recip link to/from might be, though! It's worth looking into.


 11:04 pm on Apr 13, 2007 (gmt 0)

> "clustered out" because of semantic redundancy

Hi Marcia, your thinking here sounds interesting. Could you expand on this so we could better understand what you mean here?


 4:46 am on Apr 14, 2007 (gmt 0)

I don't think link exchanges have anything to do with this penalty but it does make sense to check out the sites you link to now and then. Sometimes a domain that was good is in new, not so savory, hands.

I've mentioned this before but will repeat. I have only lost isolated pages or in a couple of cases the pages in a sub-directory. The frustrating thing is that when a page is missing other pages from my site will rank near the top. Kind of like they take the first page's place. Often the page that does well is a page that links to the missing page. So it has to be something about the specific page and not the overall site.


 5:30 am on Apr 14, 2007 (gmt 0)

>>Annej wrote that in this thread

Well, then read what tedster wrote a couple of posts after you posted about recips and penalties - read again, if you missed it the first time.

Recips will NOT cause penalties or grief for a site; this is perpetuating an SEO superstition that could be very harmful and will frighten people for absolutely_no_reason. It's the WAY some sites go about their linking that causes them grief - not doing a balanced linking program that includes reciprocal links.

There is nothing wrong with the way annej described her links, in fact it's a good thing for a site to do, if it's on topic and done so that it benefits users - and is in balance.

>>One of the sites that you might have a recip link to/from might be, though! It's worth looking into.

It's definitely worth looking into and checking links out now and then (as long as spyware and anti-virus programs are functioning and up to date - for real), if nothing else to keep them up to date and make sure they're still safe for visitors to click on. But a small percentage of links that go bad after a period of time will not cause a problem. I can personally vouch for that.

Not linking to "bad neighborhoods" became included in guidelines right after the massive PR0 penalties hit back in 2002 - and those hit the whole site. Whole 'nuther issue entirely, and there was something else going on at the time with some software that was widely distributed and in use.

Besides, links have nothing at all to do with what's going on now, with people's individual content pages being filtered out for reasons of "semantics" and/or the related topics currently under discussion.

The frustrating thing is that when a page is missing other pages from my site will rank near the top. Kind of like they take the first page's place. Often the page that does well is a page that links to the missing page.

Just maybe the pages seemed somewhat "redundant" and they decided to show just one for that query result - the one they decided was more relevant - and filter out the one determined to be less relevant.

Does the particular page that's linking to the filtered one have the query phrase in the anchor text linking to the other page? That would make it an occurrence of "distinguished" text, whereas the linked_to page might have nothing "distinguishing" about it, in that sense of the word.

So it has to be something about the specific page and not the overall site.

There's been no indication or reports that this is hitting with sitewide *penalties*, it appears to be hitting individual pages.

[edited by: Marcia at 5:47 am (utc) on April 14, 2007]


 6:28 am on Apr 14, 2007 (gmt 0)

> There's been no indication or reports that this is hitting with sitewide *penalties*, it appears to be hitting individual pages.

Not sure I agree. My main domain has been switching in and out of this filter/penalty. When it does, all the pages in this domain get hit. When they come back, they all come back.

Now the home page goes 950, but for some of the other keywords the pages go, say, +30 or +50 or something (I haven't been tracking exactly). But all the pages in this domain have been getting hit (I think - I'll double-check if/when they bounce out again - currently my pages have been back in for 2 days).

I have 4 other subdomains linked off of this main domain, and those haven't had any problems during these last 2-3 months my main domain has been getting hit.



 7:49 am on Apr 14, 2007 (gmt 0)

It does seem that in a minority of cases it affects 90%+ of pages on a domain. For the most part, though, it seems like a page penalty.


 11:02 am on Apr 14, 2007 (gmt 0)

I still think that IF it's almost like a sitewide penalty, the problematic phrases are used sitewide as well. Perhaps in the navigation. Or in the TITLE / boilerplates of the pages. But essentially all this is applied to individual pages.

There are a lot of words Google won't match up as being a legit addition to your theme. The higher you can climb with them, the harder you fall if their new reranking algo finds you unacceptable ( sometimes for no good reason ).
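As an aside, one quick way to check the "problematic phrase used sitewide" theory on your own site is simply to count the suspect phrase across the boilerplate fields. A minimal sketch - the page titles and the phrase are invented examples:

```python
# Hypothetical sketch: count how often one suspect phrase appears in the
# shared boilerplate (here, page titles) across a site's pages.
pages = {
    "/":     "California Widgets | Home - California Widgets Store",
    "/blue": "California Widgets | Blue Widgets",
    "/red":  "California Widgets | Red Widgets",
}

def sitewide_phrase_count(pages, phrase):
    # Case-insensitive count of the phrase across all titles.
    p = phrase.lower()
    return sum(title.lower().count(p) for title in pages.values())

print(sitewide_phrase_count(pages, "california widgets"))  # repeated on every page
```

A phrase that turns up on every single page - and twice on the home page, as here - is exactly the kind of sitewide repetition being discussed.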


Recently I found an interesting and somewhat worrying way to test things.

Typed in the money phrase that the page is penalized for, but used a typo ( in the word that requires higher trust ), i.e. california wiedgts.

Google not only deduced that I'm probably looking for the money phrase and highlighted the word as if I hadn't made the typo, but also didn't return ANY sites with the typo in them. Neither did it ask whether I'm actually looking for widgets instead of wiedgts. Results were more or less the same as if I'd made the query normally.


It forgot to apply the on-the-fly reranking to the SERPs.
The page was where I'd expect it to be without the penalty.

Not sure if it's gonna work today, but I had some fun with it yesterday.

[edited by: Miamacs at 11:03 am (utc) on April 14, 2007]


 12:22 pm on Apr 14, 2007 (gmt 0)

I still think that IF it's almost like a sitewide penalty, the problematic phrases are used sitewide as well

That's why I asked a ways back if one poster's site targeted a city or geographic region. Those sites (like mine) tend to repeat their geographic phrase at least once on every page and often numerous times on various pages. So, I was wondering if those type of sites were the ones getting hit on 90%+ of their pages.


 1:05 pm on Apr 14, 2007 (gmt 0)

[added to above]
Of course, that would also apply to any tightly-themed site.


 1:07 pm on Apr 14, 2007 (gmt 0)


Try putting a no follow tag on your recip links page and wait a week to see if it has an effect.
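(The "no follow tag" here presumably means the rel="nofollow" link attribute. For illustration only - not an endorsement of the advice, which is disputed below - a throwaway sketch of bulk-adding it to a links page; the regex approach and the sample markup are made up for the example:)

```python
# Hypothetical sketch: add rel="nofollow" to every link on a recip-links
# page. (Regex-rewriting HTML is fragile; this only illustrates the idea.)
import re

def nofollow_all(html):
    def fix(match):
        tag = match.group(0)
        # Leave tags that already carry a rel attribute untouched.
        return tag if 'rel=' in tag else tag.replace('<a ', '<a rel="nofollow" ', 1)
    return re.sub(r'<a\s[^>]*>', fix, html)

print(nofollow_all('<a href="http://example.com/">swap</a>'))
# -> <a rel="nofollow" href="http://example.com/">swap</a>
```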

90% of the sites that are in the -950 penalty either were involved in link exchanges or have recip links pages. That is the common factor.


 1:45 pm on Apr 14, 2007 (gmt 0)

"90% of the sites that are in the -950 penalty either were involved in link exchanges or have recip links pages. That is the common factor."

You have proof of this, do you?


 2:28 pm on Apr 14, 2007 (gmt 0)

either were involved in link exchanges or have recip links pages.

The difficulty I see with that conclusion is that so many sites with an online history of any decent timeframe have reciprocal links. Also, I know of one case where the -950 ranking fell away after adding a new link - and nothing else was changed or removed.

Establishing cause and effect (rather than just statistical correlation) is a tricky process. When it comes to reciprocal links, there are so very many sites that have them and are not hit by the -950, that noticing this as a common factor may well be correlation but not causation. There are just too many counter examples, in other words.

I'll bet most of the affected sites have a <body> element, too, if you see what I mean by that extreme example.


 4:24 pm on Apr 14, 2007 (gmt 0)

90% of the sites that are in the -950 penalty either were involved in link exchanges or have recip links pages. That is the common factor.

Absolutely not the site I've been working with. Only a couple of outgoing links.


 6:08 pm on Apr 14, 2007 (gmt 0)

Try putting a no follow tag on your recip links page and wait a week to see if it has an effect.

Personally, I definitely would NOT do that. The last thing I'd do is try to artificially manipulate and upset the delicate balance of the linking profile for a site, particularly one that has always and is still for the most part doing well, for the following reasons:

1. You want to be very_careful about trying to increase the authority score of a site while decreasing the hub score of the site by manipulating the links. Source: GoogleGuy, I can find the post if you need me to.

2. Modifying existing links using either a link condom or Javascript is doing just that - artificially manipulating - for search engines, not users.

3. Temporary measures and alterations in the OBLs = link churn [webmasterworld.com]. If there's one word we can use to describe what Google looks for in sites it's stability. Not doing a lot of SEO tricks, not tweaking and tuning - stability.

4. According to some papers/patents, there is a ranking benefit for a site for on-topic OBLs. It's also been the experience of many people. Why lose that for no reason?

5. It's unethical and unfair to those with whom quality reciprocal links were exchanged, by agreement.

From yet another of the same endless arguments about reciprocal links that happens all over. Straight from the horse's mouth at the Google group:

Adam Lasnik:
Marcia's right: reciprocal links have been around forever, and Google
doesn't frown on engaging in reciprocal linking in moderation.

The key here is, indeed, moderation :). If, say, 90% of your backlinks
are reciprocal, that's probably not going to improve how our algorithms
view your site. Or worse, if 90% of your backlinks are reciprocal and
not likely to be of interest to your user.

But exchanging links here and there -- *especially* when done with
clear editorial judgement (e.g., you're not just accepting dozens of
link exchanges willy-nilly) -- that's not the sort of thing Google
looks down upon.

Hope that helps clear things up a bit!


But not improving how Google looks at the site is not the same as penalizing, is it? There's a big, big difference between things hurting and things just not helping.

90% of the sites that are in the -950 penalty either were involved in link exchanges or have recip links pages. That is the common factor.

And I betcha 100% of them have URLs that start with http:// too!


 6:26 pm on Apr 14, 2007 (gmt 0)

I still think that IF it's almost like a sitewide penalty, the problematic phrases are used sitewide as well. Perhaps in the navigation. Or in the TITLE / boilerplates of the pages. But essentially all this is applied to individual pages.

Very interesting insight. I think you are on to something here.

The main domain of mine that got hit has been more tightly themed than the other subdomains that haven't had any problems. The target keyword phrase for the home page was in many of the title tags and in all of the breadcrumb text links, multiple times.

I spent a few hours yesterday updating the site to make it more persuasive to a visitor and less "over-optimized". Since I had not done this sort of major overhaul since I built this domain a few years ago, I was a bit surprised at just how much I might have "overdone" it.

At the moment, the domain is not getting 950-filtered, and hasn't since Wednesday. But it has flipped in and out of this thing for the past few weeks.

About 4-6 weeks ago I removed a subdomain-wide link-back to this main domain from a relatively new blog that I added last fall. The net effect was to remove 50-60+ linkbacks with the target phrase in them. A few weeks later, the domain started coming back for a few days at a time, and then getting filtered again.

So maybe that move got it from being filtered "hard" to being on the edge of the filtered criteria.

We'll see what this latest move does over the next few weeks.

[edited by: egomaniac at 6:38 pm (utc) on April 14, 2007]


 6:32 pm on Apr 14, 2007 (gmt 0)

Of my two main subdomains that are fine (and have had no problems with the 950 filter), they do have keyword breadcrumb linkbacks to their subdomain home pages, and they do have *some* keywords in the title tags. However there is more variety in these keywords. The theme is less tight, the nature of the content is more "organic" in the sense that the content was originally written with newsletter readers in mind first, and then the content was repurposed for the web.

As for recip links. All 3 of my domains have recip link pages. Two domains are OK, and 1 is not. So while that might be a gating factor, it certainly is not a definitive causal factor.


 7:51 pm on Apr 14, 2007 (gmt 0)

Could it be an AdSense problem, if you have sites in the same category and use AdSense on those sites? One thing is for sure: they also use the info from AdSense.

One more thing: when I search on google.de - only there - I see ranking. Not good ranking, but ranking: I'm placed on page 7, not 95 as on google.com. Yes, the site is on a US server, and yes, it's a .com domain.


 1:42 am on Apr 15, 2007 (gmt 0)

It's not directly an AdSense problem, but since they are trying to stop spammy sites like MFAs, if your site has a phrase pattern related to MFA sites, that could be the problem.

That's not to say it's a spammy site - many sites that are getting caught in this are not.


 12:16 pm on Apr 15, 2007 (gmt 0)

I'll add this again, although I've repeated it many times before... it could shed some light on how one site can use the same amount of optimization and get away with it, while another subsection/domain is off the radar every odd week.

I strongly believe that this whole phrase based reranking - as any other filter btw - targets only SOME phrases.

Those which are competitive or include a competitive word ( and have a high threshold for trustrank on the SERPs ).

This includes geographic locations, simply because they're universal for any combination, could be used to jump from one theme to another, thus are closely monitored. Also, I don't think Google would use two sets of thresholds to decide what's popular, important, or too profitable to let out of their hands. TrustRank is still a good parallel to whether a targeted phrase is monitored by the most strict filters or not.

Hence all sites that have been sent to the end of the queue had to have a relatively high trust to be in the primary index for such queries in the first place. Also, if you ever caught the fluctuations of the trust threshold, it's a fair call to think this would apply to the filters connected to them as well. Sites appear in the primary index on/off/on/off.


The monitored phrases these sites did well for were already matched to the inbound anchors, internal link anchors, page titles and occurrence in content - but only to see which were relevant, not to see if they were "legit". It's still OK to have the keyword three times in the title, and we see that some only have it in the content+anchor combination. They don't care, as they only count the occurrence of any phrase ONCE in this case.

Now along came this new reranking algo, and it looked for OTHER competitive terms present on-page as well. Perhaps it killed the pages that linked to those that don't show anymore. Perhaps it's irked by the navigation anchors. You might have branded a name and used a title that's off - or perhaps it's not even the phrase, but its co-occurrence with some distinct OTHER phrases ( or N number of such ).

I'm not guessing, this is the actual patent and what we see in practice. It's just I wouldn't know which one applies to the given site :

- {simple} If the pages did well for one topic ( my-city widgets ), and the site also optimised ( an area, page, subdomain, part on page, whatever ) for another that was "close enough" by previous standards ( buy widgets / othercity widgets / my-city hotels ), pages might now be filtered out for not having enough inbounds supporting the second ( third, fourth, eleventh ) theme on them.

...or worse yet:

- {complicated} If the site had all the inbounds, page had all the internal anchors it needed ( my-city widgets, my-city hotels ), the filter might have ...wrongfully... decided that a page about THIS theme shouldn't include THAT and THAT theme, or N number of "close enough" themes ( my-city widgets / hotels / toothpaste / apples / air conditioners ), because it's unnatural, or isn't that close anyway, or is used 51% of the time in this combo by spammers, and again, penalized the pages for whatever it deemed as spam. ( And let's use the word spam with some irony. )

...and finally:

- {impossible} Google has some weird association set in the brain of its AI for a combination they, you and users aren't even aware of. You have no chance of knowing what this is ( as in guessing from the thematic relevance and using reasoning ), and the only thing you can do is collect data of your pages that work, and those that don't, examine it as if the words were pretty pictures without a meaning, and find the one ( combination ) that the bot doesn't like.

ps. I'm kind of waiting for the "it's caused by Google sitemaps" argument, perhaps later someone would pop in and share the information.


 3:10 pm on Apr 15, 2007 (gmt 0)

Read the previous message by Miamacs very carefully, folks. It's that short summary of the problem that people have been asking for. At least it's as short as it could be.

I know many will be disappointed that there is not a simple solution but that's the way it goes sometimes.

The part I still can't get my brain wrapped around is the idea that they only count the phrase once. I guess because it seems so harsh. I know, I'm not being logical.


 3:14 pm on Apr 15, 2007 (gmt 0)

My site lasted back in the top 10 for my search phrases for a whole 5 days; now the entire site is gone. Even my top search term is gone, which until today had never been touched. When I look at the phrases that my site was previously ranking well for, it is definitely a phrase-based filter. For the sites currently in the top 10, the key phrase is found NOWHERE on their pages - each individual word is on the page, but never together in the form of the search phrase. That is just weird and not very helpful for users. Many of the current pages are not what users would be looking for.


 3:55 pm on Apr 15, 2007 (gmt 0)

I'm seeing a blatant spam site in the top 10 for a competitive tech related keyword I'm tracking.

One of the sites has 50 links in the footer to other sites related to holidays abroad, pets, sneakers, camping equipment etc. Not only that but the front page reads like a site map with over 300 keyword rich internal links.

Another site has more stuffing than a thanksgiving turkey, keyword stuffing that is.

So keyword stuffing for these sites is not tripping the filter.


 5:20 pm on Apr 15, 2007 (gmt 0)


You're right on the money. If it were "phrase based" you would find yourself at result 30-100. I told tedster a while back when we were hit with some "phrase based" result, and we did not slide to -950. We slid down, yes, but not to -950.

Being -950 is a screaming penalty, because Google downgraded your page to the bottom. This is all about the downgrading of links. Yes, there are still sites with link exchanges on them that are up at the top, and it's just a matter of time until they find themselves hit with the penalty. Every week there are a few more people who all of a sudden find themselves at -950. It's a slow downgrade.

Think about this: how many really good sites out there have "recip links" or "exchange links" pages on them? You do not see eBay, Wiki, Dell, Digg, Pepsi, Drudge Report, etc. having exchange links or recip links pages, do you?

The absence of those kinds of pages is a signal of quality. Good sites that people like and visit often DO NOT have "recip link" or "exchange link" pages on them. They do not need to.

Having those types of pages hurts sites in the long run. If you build a really good page and people like it, they will naturally link to you. Notice, I said naturally! Not unnaturally, like webmasters asking other webmasters times 100.

Yes, it's fine for a mom and pop to do this, but you have to ask yourself one question: are you a mom and pop, or a PROFESSIONAL? If you want to be a mom and pop, that's fine, but do not expect Google to give you a ton of traffic.

[edited by: trinorthlighting at 5:48 pm (utc) on April 15, 2007]


 5:48 pm on Apr 15, 2007 (gmt 0)


Haha... it's your focus... links... perhaps a little too much. You say the big sites don't have them, but that's not strictly true. There's big, and there's $%^&ing huge. You're naming the $%^&ing huge sites, and I'm not sure they make good examples for the rest of the SERPs. Go to the next level down - the not quite $%^&ing huge - and you will find they are buying links rather than swapping them. They also buy networks and point those at their main site. Nothing natural about that, so I would not agree that not having recips is a sign of quality - often it's a sign of a big budget. Now, if you made the decision to favour big-budget sites over recip sites, you would certainly clean out a lot of spam, and along with it a lot of collateral damage... but it would not have anything to do with natural link patterns.


 5:50 pm on Apr 15, 2007 (gmt 0)

There is a difference between advertising links, which should carry 0 PageRank and 0 TrustRank, versus the links that do carry weight.

I can also name sites that do not have those types of budgets. Look at all the blogs out there that are really popular and have $0 in advertising budget. You do not see exchange links or recip links on their sites either.


 6:11 pm on Apr 15, 2007 (gmt 0)

trinorthlighting, you make some good points; however, I'm seeing huge blog networks thriving with massive reciprocal link activity. Think of networks like Weblogs Inc, where there is massive interlinking between sites on the network. They get far more visitors than Pepsi's home page :)


 6:34 pm on Apr 15, 2007 (gmt 0)

They will get caught, foxtunes - just a matter of time. Then you will see them come to WebmasterWorld and post that they are a new victim of the -950.

Want to run a good experiment like we did a month ago? Report those sites as spam, wait a few weeks, and check on them. Guess what you will see: -950, or a total ban from the index.

That is why I keep saying its about links.


All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved