Forum Moderators: Robert Charlton & goodroi


Google's 950 Penalty - Part 10


netmeg

8:26 am on May 28, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



< Continued from: [webmasterworld.com...] >
< related threads: -950 Quick Summary [webmasterworld.com] -- -950 Part One [webmasterworld.com] >

I have two URLs that were 950'd on Friday. Unfortunately they are my two most requested URLs (other than the home page) - they'd been #1 in the SERPs for at least three years, and now they only show up on the last page for the most common search phrase. They both showed PR previously, but TBPR has been greyed out since the last update. However, for certain permutations of the search phrase, they still rank #1. The search string usually comprises the city name and the event, and often includes the year.

Example:
city event - 950'd
city event 2007 - 950'd
event city - #1
city state event - #1
city state event 2007 - #1

As far as I can tell, it is ONLY two URLs, out of around 500, that have fallen into this (so far, anyway).

What it all means, I have no idea.

[edited by: tedster at 9:14 pm (utc) on Feb. 27, 2008]

Miamacs

12:54 am on Jul 6, 2007 (gmt 0)

10+ Year Member



I'm doing a lot of research, with domains specifically assigned only to test this... and it's a lot of fun.

May I ask whether the source pages you got the links from were relevant to the anchor text? Whether they were relevant to your site? Whether these sources are actually trusted, yet they do NOT rank for these phrases? I mean the pages you linked from... do they come up for the phrases you linked with? If not, try adding an extra word to the search query, something generic enough. What did you see? Were they there? Were they also -950?

...

Do you know which one of the inbounds (older or newer - it doesn't matter, though it's probably one of the older ones) brings in the most trust? Is it off topic compared to your new "link campaign"? Or to the site?

webastronaut

2:32 am on Jul 6, 2007 (gmt 0)

10+ Year Member



"The most popular keyword (phrase) in my sector puts me at -950 while adding any 3rd word related puts me back at number 1."

I had this problem with many domains but I'm back...

"The 950 Google problem can only be solved by keeping up with your site correct coding or even if you think your domain has too many keywords or whatever problem." < wrong answer!

I've had sites pop out no problem even when I know they have way too many on-page keywords and other problems. Just keeping your site alive and interesting will bring you back, without having to deal with all the correct Google rules... That's what I do, and it works!

webastronaut

2:35 am on Jul 6, 2007 (gmt 0)

10+ Year Member



... and Web Law rules n such...

dibbern2

3:40 am on Jul 6, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I had another directory bounce back from 950 today. Removed all internal crosslinks except one (to the top page *index*) last Friday.

This makes five sites or directories back over the last 3 weeks. Each one received the same treatment. Those I haven't touched are still in 950 land.

For me, it was either the internal cross links, or the keywords I had in those links. I believe this is just one facet, however, of a problem that can be caused by several different factors.

outland88

3:57 am on Jul 6, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>I've had sites pop out no problem even when I know they have way too many on-page keywords and problems, just keep your site alive and interesting will bring you back without having to deal with all the correct Google rules<

From what I've seen I'm under the impression that just adding any content breaks the penalty. The thing is though if you're going to make changes make them count.

Marcia

4:14 am on Jul 6, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



As I said, this is only my suspicion. But I do see the same kind of phenomenon that outland88 is describing. And I've also helped pages improve their rankings by losing the intense SEO focus on targeted keywords in the copywriting, and allowing the content to breathe a bit more naturally.

The copywriting was bordering on keyword stuffing, and I'm also seeing issues in navigation and internal links.

I haven't had to work on any -950 pages with this approach so far (knock on wood here).

I've had a brush with a couple, plus one that went -1000 right out of the ballpark it was so bad.

But I have helped pages jump +40 places or so by giving them this kind of attention, and that makes me suspect that co-occurrence is a factor in play in the algorithm today. It comes up in linguistic and semantic study, as well as in IR (information retrieval), the granddaddy of search engine technology. Co-occurrence is not exactly cutting edge; it's been knocking around for quite a few years.

That's exactly the approach I'm taking this very week with yet another site that hasn't gone -950, but does have a penalty (about 80 positions down) that definitely appears to be phrase-based and shows the same on-page and site-wide characteristics as a lot of the sites I've looked over that were hit with -950.

I've been suspecting co-occurrence/semantic factors, which is why I've been delving into those papers so heavily.

[webmasterworld.com...]

This week I've been poring over pages in the top ten and last 20 or so making comparisons and crunching numbers. Some of those sites, in a particular niche, sat in the top 5 for a long time and are now down the tubes. With some sites, it appears to be a matter of excessive repetition of the "main" words or keyphrase(s) across way too much of the site, with far too much emphasis on it (or them) - to the point of seriously unnatural.

Grab hold of the Firefox extension that lets you check occurrences and densities - it's a time saver and an eye opener. You can almost tell right about where the breaking point is, you see it so often with 950'd sites.

Added:
On a more specific note: I've been checking KW densities and occurrences of the site's *main* keyword (the one used as part of the site's phrases) within just the links on a page. A high number of occurrences in links - putting it over the 12% figure, say 15% or 16% - shows up VERY often on pages that dropped like a rock from where they used to be.
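For anyone who wants to run this kind of check themselves, here's a minimal sketch (Python, standard library only) of measuring keyword density within anchor text alone. The class and function names are my own illustration, and the 12% figure is just an observation from this thread, not anything Google has documented:

```python
from html.parser import HTMLParser

class AnchorTextCollector(HTMLParser):
    """Collects the words that appear inside <a> tags, ignoring all other text."""
    def __init__(self):
        super().__init__()
        self.in_anchor = False
        self.anchor_words = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_anchor = True

    def handle_endtag(self, tag):
        if tag == "a":
            self.in_anchor = False

    def handle_data(self, data):
        # Only count text while we're inside an anchor element
        if self.in_anchor:
            self.anchor_words.extend(data.lower().split())

def link_keyword_density(page_html, keyword):
    """Percentage of anchor-text words that match the target keyword."""
    parser = AnchorTextCollector()
    parser.feed(page_html)
    words = parser.anchor_words
    if not words:
        return 0.0
    return 100.0 * words.count(keyword.lower()) / len(words)

page = ('<a href="/a">blue widgets</a> '
        '<a href="/b">widgets guide</a> '
        '<a href="/c">contact us</a>')
# "widgets" is 2 of the 6 anchor-text words, i.e. about 33%
print(round(link_keyword_density(page, "widgets"), 1))
```

A page whose anchor text scores well above the rest of the top-ranked pages for the phrase would be the kind of outlier being described here.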

[edited by: Marcia at 4:22 am (utc) on July 6, 2007]

errorsamac

2:50 pm on Jul 6, 2007 (gmt 0)

10+ Year Member



tedster - Regarding your comment on an overall trust metric for the entire backlink profile - I think this might be a factor. None of the sites had quality backlinks, except one. That site had excellent backlinks from major news websites. It received those links and other natural links; a few months later, unnatural backlinks started to pop up and the site went -950. It eventually recovered, then a few new links appeared (fewer than 10) and the site went -950 again.

Miamacs - The source pages for my latest example were not irrelevant, but at the same time they were not "widget" sites (there are only a few in my niche, so these links were from sites that are interested in widgets, if that makes sense). They were not paid text links. If I search for a generic query for any of my sites that are seeing -950, the page I am searching for is still penalized.

For example, every page on example.com is seeing the -950 penalty. Say the site is all about widgets and I have the sentence "blue widgets are great to use in your home or on the go". If you search for "great to use in your home or on the go", the page that has that phrase is at the end of the SERPs. It's not a targeted phrase, of course, but being at position -950 for it makes me believe that something is wrong with the domain overall, and it's not related to specific keywords.

I have multiple -950 examples and I might be confusing some sites with others.

[edited by: tedster at 3:01 pm (utc) on July 6, 2007]

errorsamac

2:59 pm on Jul 6, 2007 (gmt 0)

10+ Year Member



Marcia - Are you saying that if example.com is all about widgets, if you have links off of www.example.com/ that point to "Blue Widgets", "Green Widgets", "Red Widgets", etc. that the whole site is penalized?

In my latest example, I am targeting the keywords "Widgets" and "Blue Widgets". The internal links on the site do NOT contain the word "widget" or any similar word (the internal anchor text is the brand name of the widget, not the word "widget" itself).

I have "Widgets and Blue Widgets" as the title, the keyword once in the meta description, and the "widget" keyword 5 times (6% keyword density) in the body of the page. I removed 2 instances of the "widget" keyword, so now I have it listed 3 times total (down to a 4.9% keyword density). I'm not sure how much lower I can go on keyword density - too low and Google won't even rank the pages anymore, and I'd kill my rankings on Yahoo/MSN.
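As a sanity check on those numbers, here's how the density arithmetic works out. This is a hypothetical sketch: the word totals (roughly 83 and 61) are back-calculated from the percentages reported above, not taken from the actual page.

```python
def keyword_density(text, keyword):
    """Percentage of body words that match the target keyword."""
    words = text.lower().split()
    if not words:
        return 0.0
    return 100.0 * words.count(keyword.lower()) / len(words)

# 5 occurrences in an ~83-word body comes out to about 6%:
body = " ".join(["widget"] * 5 + ["filler"] * 78)
print(round(keyword_density(body, "widget"), 1))   # ~6.0

# Dropping to 3 occurrences in an ~61-word body gives about 4.9%:
body = " ".join(["widget"] * 3 + ["filler"] * 58)
print(round(keyword_density(body, "widget"), 1))   # ~4.9
```

Note the implied word counts differ between the two measurements, which is why removing 2 of 5 occurrences doesn't simply scale the density by 3/5.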

jk3210

8:09 pm on Jul 6, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Okay, non-believers, here's your quiz for the day.

Which item below best describes why Google would slap a 950 penalty on its own "http*//books.google.com/" pages...

1) Violation of Webmaster Guidelines
2) Over-Optimization
3) Lost Trust

econman

9:53 pm on Jul 6, 2007 (gmt 0)

10+ Year Member



We have a site which had been at #1 for its site name, and in fairly good position for some other keywords. A couple of months ago traffic suddenly tanked; it turned out the site was gone from the visible parts of the Google SERPs, and it has largely remained in that position for the past couple of months, except for a brief period some weeks ago when Google referrals briefly picked up before dying again.

I haven't studied these -950 threads, and we haven't done anything to attempt to fix the problem (other priorities). But, I was aware of the -950 issue, and confirmed that this site has been sitting near the bottom of the last page for its site name, and some other keywords where it had previously been on the first page.

Today I noticed traffic had increased sharply, so I spot checked a few SERPs, and sure enough it is (at least momentarily) back stronger than ever (e.g. both #1 and #2 indented for the site name).

If we had just finished a bunch of work implementing the suggestions in this thread, by reducing keyword density, or making some other attempt to "fix" the problem, I would have gotten the impression our hard work had proven to be a miracle cure. But, of course that would have been the wrong conclusion to reach.

My current hypothesis is that the site must be sitting right on the border of whatever condition or conditions trigger this -950 penalty, and for some reason it must be fluctuating back and forth across that dividing line -- thus the bizarre pattern in which one day it's considered the #1 most relevant site and the next day it belongs at position #992.

What's odd, however, is that we haven't changed anything on the site -- so why would its status fluctuate back and forth relative to a filtering criterion? Very odd.

errorsamac

9:53 pm on Jul 6, 2007 (gmt 0)

10+ Year Member



That URL is still coming up for me. What did you search for? I just searched for "books" and it was # 1.

trakkerguy

10:02 pm on Jul 6, 2007 (gmt 0)

10+ Year Member



jk3210 - what's a good search to see their books site at -950?

econman - The penalty behavior is odd, for sure. Thanks for posting your example. It's a good reminder that just because someone makes a change and their rankings come back, it doesn't prove cause and effect.

jk3210

10:53 pm on Jul 6, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I don't think they want us posting exact search terms here, but the search I stumbled on to was a geographic location, and the pages from /books.google.com/ that were 950'd were from very popular guide books on the search term topic.

Sixteen /books.google.com/ pages were listed between #900-999.

steveb

11:01 pm on Jul 6, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"so why would its status fluctuate back and forth relative to a filtering criteria"

It's important to remember that this is not a filter but a specific penalty. Some people are now trying to attribute "my ranking isn't good" to the 950 penalty. That won't help them understand their problem, because this ain't it. There is no "partial" or "gradual" 950 penalty. It is a steep cliff... #1 versus #922 is a good example. If you are on the edge of this penalty, you have a foot on the peak and a foot in the abyss.

neo schmeichel

1:12 am on Jul 7, 2007 (gmt 0)

10+ Year Member



"What's odd, however, is that we haven't changed anything on the site -- so why would its status fluctuate back and forth relative to a filtering criteria?"

If our sites (I'm the lucky admin of one of the 950s myself) are really sitting on such a razor-thin line related to phrase-based algorithms, then I think "Detecting spam documents in a phrase based IR system" may offer some insight into econman's question. Basically, the metrics used in the system's co-occurrence algorithms are based on normative ranges within the set of documents already deemed relevant for a phrase query. If there's a shift either in how people use phrases in their documents, or in how relevancy is originally determined (thus changing the set of documents that defines what is normal), then the next iteration of the phrase index would change the threshold for "normal" co-occurrence, and certain documents will fall into and out of that range without having changed at all.

I think it's important to note here that causation can come from two different directions: from a cultural/linguistic shift in a vertical, and from an algorithmic shift at the search engine (which may or may not have anything to do with phrase based indexing in particular). Sometimes changes we make ourselves can hurt or help us, but sometimes we can be hurt or helped by changes made to other sites and indirectly related algorithms.
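The shifting-threshold idea above can be sketched in a few lines. Everything here is hypothetical - the patent's actual statistics and metrics aren't public - but it shows the mechanism: a page that never changes can flip in and out of a "normal" band when the reference set of relevant documents shifts around it.

```python
from statistics import mean, stdev

def normal_range(densities, k=2.0):
    """Normative band: mean +/- k standard deviations of the relevant set."""
    m = mean(densities)
    s = stdev(densities)
    return (m - k * s, m + k * s)

def flagged(doc_density, relevant_set_densities):
    """True if the document falls outside the normative band for the phrase."""
    lo, hi = normal_range(relevant_set_densities)
    return not (lo <= doc_density <= hi)

# Two snapshots of the "relevant set" for the same phrase (hypothetical
# per-document densities of some co-occurrence metric):
old_set = [2.0, 3.0, 4.0, 5.0, 6.0]
new_set = [1.0, 1.5, 2.0, 2.5, 3.0]   # the niche drifts toward lighter usage

page = 7.0                             # our page, unchanged between snapshots
print(flagged(page, old_set))          # inside the old band -> False
print(flagged(page, new_set))          # outside the tighter new band -> True
```

The page's own metric never moved; only the population it is judged against did, which is one way a site could oscillate between #1 and the end of the SERPs with no changes on the webmaster's end.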

bobsc

5:24 am on Jul 7, 2007 (gmt 0)

10+ Year Member



It's important to remember that this is not a filter but a specific penalty.

Did you or anyone you know receive a penalty notification from Google?

errorsamac

2:00 am on Jul 12, 2007 (gmt 0)

10+ Year Member



I'm not sure if Google turned some more knobs, but three of my sites came out of the penalty. Here is what I did:

Site 1: I was an idiot and re-did my internal linking structure which gave me the -950 penalty (I used "blue widgets" way too many times now that I look at it). Anyway, I undid all changes (just reverted to an old copy of the site) and the penalty was removed in 10 days.

Site 2: This site received less than 10 external links and it caused the penalty. The site has been static for months so I believe the links caused this penalty. I did nothing to the site and it escaped exactly two weeks later (14 days).

Site 3: This site has been in 950 land for months. I removed a duplicate set of links in the middle of the page (a "blue widgets" link was at the top, and a "blue widgets" link was in the middle of the page) and I also made it so all pages off of the main page did not link anywhere else except back to the main page. I did not change any text on the site, I just removed internal links. I made this change within the past 3 days and I am really surprised the site jumped back so quick. It makes me think this was knob turning more than actual fixes on my end.

Did anyone else escape?

sahm

4:42 am on Jul 12, 2007 (gmt 0)

10+ Year Member



This was definite "knob turning"...parts of my site have been fluctuating in and out of the 950 penalty since at least March (with no major changes on my end), and as of 7/8, my entire site is unaffected by the penalty...with 4x the traffic. I haven't seen this much traffic from Google since last year. I know it won't last, but I'll enjoy it while it does :)

oender

8:27 am on Jul 12, 2007 (gmt 0)

10+ Year Member



Yes, my site escaped too.
It had been in 950 land for two weeks.

Here is what I did:
I thought about changing the description.
I thought about changing the navigation.
I thought about changing the link structure.
I wanted to change the on-page factors (de-optimization).
But I changed nothing,
and the site came back today.
Thank you, Google.
Nevertheless, I will change a lot, I promise.

matt900

3:22 pm on Jul 12, 2007 (gmt 0)

10+ Year Member



One of my sites just got out of the penalty and the pages went back to their original positions. The site dropped on 6/27 and came out on 7/9. On my site, all the article pages went to the last page and only the home page was still showing up on top.

I made some changes during that time: updated my robots.txt file, added nofollow to some internal links, and did a little bit of de-optimizing...nothing too major, but most of the changes made sense. Actually, it made me go through and just clean up a few things that I had overlooked...or put off.

Not sure if it helped, or it was all just a coincidence...anyway, I won't touch anything for a while, just in case :)

mirrornl

5:21 pm on Jul 12, 2007 (gmt 0)

10+ Year Member



definite "knob turning" indeed

my whole site went 950,
first time for this 7 year old one
didn't change anything in the last months

neo schmeichel

5:17 am on Jul 13, 2007 (gmt 0)

10+ Year Member



Looks like Google's splitting the difference between Bastille Day and Independence Day -- my site's out too. Actions taken include:

1. Fret and worry and have meetings about how to fix it.
2. Make minor changes that likely had only a small impact on Google's perception of the site -- mostly switching to meta robots (noindex, follow) rather than rel=nofollow for some of the less interesting pages on the site. Also integrated some product reviews onto the product pages (lightbulb) rather than keeping separate review pages. This latter change definitely had nothing whatsoever to do with our escape to sweet liberty, as our 950 problem was definitely hitting landing pages rather than product pages.
3. Consolidated some landing pages...in staging. If our staging servers are getting crawled, we're definitely going to have to get more aggressive with robots.txt. :)

We're still going to move forward with a complete redesign of the site (upgrade the aesthetics, functionality, and html, with little change to the core content), but probably aren't going to risk rewriting our ugly URLs anymore.

And cheers to anyone else who's out -- hopefully it's not just work release.

JoeSinkwitz

1:36 pm on Jul 13, 2007 (gmt 0)

10+ Year Member



I have reason to believe that the mathematics of the EOS (end of SERPs) re-ranking have changed once again; no longer are all sites that were previously bouncing off the cliff on the re-rank going to the EOS -- now some are going MOS (middle of SERPs).

You can take what I see with a grain of salt, but I watch a lot of different queries for a bunch of very different sites, and it is something I've noticed increase since the 11th. What am I talking about?

Let's take a site about widgets... it had been ranking on the first page for 4-5 years without any issues. This past spring, along with a bunch of other quality sites, it found its ranking oscillating between the first page and the final page of the results set, to the utter confusion of the webmaster. Along comes July, and while the webmaster is getting used to the violent swings and refusing to modify a site that is made for users, he finds it bouncing instead from page 1 to page 32. #*$!?

We've been saying for a while that there appear to be multiple reasons why a site goes EOS, but only a few things that seem to get it out (attempting to remove any colliding themes via text and nav changes, AND/OR improving inbound quality links for the phrasing in question). This middle-of-SERPs re-rank could be Google attempting to separate out those sites getting bumped as a side effect - a roundabout way to grade its filtering on each specific reason without sacrificing what it may deem an acceptable quality mix for page 1.

IMHO, so long as Google struggles with false authority sites and issues such as proxy hijacking, this re-ranking will continue to wreak havoc, because it is simply not set up correctly to bump the type of sites it should be bumping.

Cygnus

trakkerguy

4:47 pm on Jul 13, 2007 (gmt 0)

10+ Year Member



sites that were previously bouncing off the cliff on the re-rank are going to the EOS -- now some are going MOS (middle of serps).

Glad you posted about this, Cygnus. I've definitely seen this and have been trying to interpret what it means.

A long-suffering site I've been working on exhibited this MOS behavior earlier in the spring, and I thought it was a good sign of coming out of the EOS penalty. Then it went back to EOS.

Last week it did change, and it is again in the MOS for many searches, EOS for a few, and ranking #1 for some, mostly non-competitive terms.

Don't know if it is a sign of improved trust, or just Google trying out a new tool. I like your theory that it is a way for them to separate out sites to fine-tune their filter/penalty.

MrStitch

6:33 pm on Jul 13, 2007 (gmt 0)

10+ Year Member



I backed off on the on-page SEO of my site, and submitted a re-inclusion request.

I'm hoping something will happen (ANYTHING at this point would be good news)

I still have a couple of internal pages ranking, which is ok I guess. A little is better than nothing. ;)

Robert Charlton

6:34 pm on Jul 13, 2007 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



...now some are going MOS (middle of serps)...

I'd noticed this too. I don't watch these that closely or that often, but in the past few weeks I've seen a move up and then a jump back down. Haven't checked very recently.

Timetraveler

7:52 am on Jul 15, 2007 (gmt 0)

10+ Year Member



Did anyone who rebounded and got out of the EOS ranking this week go right back to the EOS today?

mirrornl

9:33 am on Jul 15, 2007 (gmt 0)

10+ Year Member



this site is back after 3 days at -950,
I'm so happy,
fingers still crossed

Robert Charlton

6:13 pm on Jul 15, 2007 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



I'm seeing some sharp moves up for previously 950ed pages, as well as a bunch of other changes. As I describe in the July SERP changes thread, it does appear that Google is working on fixing the collateral damage problem.

b2net

7:27 pm on Jul 15, 2007 (gmt 0)

10+ Year Member



It's not going in the right direction. More sites dropped to -950 this morning. And on one site, individual pages drop while others remain. I can get back up by renaming the file and making some modifications, but it doesn't make sense. Why would Google give such a harsh penalty to normal pages?
This 155 message thread spans 6 pages.