
Google's 950 Penalty (part 4) - or is it Phrase Based Re-ranking?

     

tedster

8:58 am on Feb 8, 2007 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



< continued from [webmasterworld.com...] >
< related threads: -950 Quick Summary [webmasterworld.com] -- -950 Part One [webmasterworld.com] >

Something is now becoming clear, and I think it's time we put away the name "950 Penalty". The first people to notice this were the heaviest hit, and 950 was a descriptive name for those instances.

But thanks to the community here - the many examples shared in our monthly "Google SERP Changes" threads and in the "950 Penalty" threads themselves - we can now see a clearer pattern. The demotion can be by almost any amount, small or large, or it might even mean removal from the SERPs altogether.

It's not exactly an "OOP" and it's not the "End of Results" penalty. From the examples I've seen, it's definitely not an "MSSA Penalty" -- as humorous as that idea is. (Please use Google to find that acronym's definition.)

It's also not just a Local Rank phenomenon, although there are definitely some similarities. What it seems to be is some kind of "Phrase-Based Reranking" - possibly related (we're still probing here) to the Spam Detection Patent [webmasterworld.com] invented by Googler Anna Lynn Patterson.
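For anyone who hasn't dug into that patent: its core idea is that honest documents contain a statistically predictable number of "related phrases" for any phrase they cover, while keyword-stuffed spam contains far more than expected. Here's a minimal, purely illustrative sketch of that scoring idea - the tiny phrase table, the threshold numbers, and the function names are all invented for this example, not taken from Google's actual implementation:

```python
# Illustrative sketch of the phrase-based spam idea: a page whose count
# of "related phrases" for a phrase is far above the statistical norm
# gets flagged. Everything here (table, thresholds) is hypothetical.

# Hypothetical table: phrases that tend to co-occur with a given phrase
RELATED_PHRASES = {
    "blue widgets": {"widget reviews", "buy widgets", "widget prices",
                     "cheap widgets", "widget store", "widget deals"},
}

def related_phrase_count(page_text: str, query_phrase: str) -> int:
    """Count how many known related phrases appear in the page text."""
    text = page_text.lower()
    return sum(1 for p in RELATED_PHRASES.get(query_phrase, ())
               if p in text)

def looks_like_spam(page_text: str, query_phrase: str,
                    expected: float = 2.0, tolerance: float = 2.0) -> bool:
    """Flag pages whose related-phrase count greatly exceeds what an
    honest document would be expected to have (expected + tolerance)."""
    return related_phrase_count(page_text, query_phrase) > expected + tolerance

honest = "A review of blue widgets, with widget prices compared."
stuffed = ("blue widgets widget reviews buy widgets widget prices "
           "cheap widgets widget store widget deals blue widgets")

print(looks_like_spam(honest, "blue widgets"))   # normal page
print(looks_like_spam(stuffed, "blue widgets"))  # keyword-stuffed page
```

If something like this is running, it would explain why over-optimized pages drop for specific phrases while the rest of the site keeps ranking.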

So let's continue scrutinizing this new critter - we may not yet have it nailed, but I'm pretty sure we're closer. The discussion continues:

[edited by: tedster at 9:18 pm (utc) on Feb. 27, 2008]

JoeSinkwitz

5:20 pm on Feb 11, 2007 (gmt 0)

5+ Year Member



There's a lot to be said about the scraper theory. I manage a fairly significant number of sites for my company and our partners (mostly in hypercompetitive industries); as such I have the misfortune of encountering a lot of the funky penalties that crop up.

I've been lurking on the 950 penalty threads because I wasn't entirely sure what was going on, given that multiple factors do seem to be in play, but I have found some commonality with my own experiences.

1. Phrase-based penalties & URL-based penalties; I'm seeing both.
2. On phrase-based penalties, I can look at the allinanchor: for that KW phrase, find several *.blogspot.com sites, run a Copyscape check on the site with the phrase-based penalty, and see those same *.blogspot.com sites listed...scraping my and some of my competitors' content.
3. On URL-based penalties allinanchor: is useless because it seems to practically dump the entire site down to the dregs of the SERPs. Copyscape will still show a large amount of *.blogspot.com scraping though.
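The Copyscape step in point 2 boils down to measuring how much of your page's text reappears on another page. A rough sketch of that idea using word "shingles" - Copyscape's actual method is more sophisticated, and the 0.5 threshold and sample strings here are my own assumptions:

```python
# Detect likely scraped copies by comparing overlapping word n-grams
# ("shingles") between an original page and a candidate page.
# The threshold and sample texts are illustrative, not Copyscape's.

def shingles(text: str, n: int = 5) -> set:
    """Return the set of n-word shingles in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_ratio(original: str, candidate: str, n: int = 5) -> float:
    """Fraction of the original's shingles that also appear in the candidate."""
    a, b = shingles(original, n), shingles(candidate, n)
    return len(a & b) / len(a) if a else 0.0

mine = "our guide explains how the widget assembly process works step by step"
scraped = ("our guide explains how the widget assembly process works "
           "step by step with extra spam links")

print(overlap_ratio(mine, scraped) > 0.5)  # high overlap = likely a copy
```

A ratio near 1.0 means the candidate contains your text nearly verbatim; near 0.0 means it's original.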

Getting rid of scrapers is a thousand page thread in and of itself, but what I've been doing so far is a mixture of modifying titles, slightly modifying on-page text, getting some new links that match the new title, and where possible, turning in the *.blogspot.com junk as spam on both the blogger and G spam report side.

Normally scrapers wouldn't be a huge problem, but with Google continually tweaking their authority knob, those *.blogspot.com sites are becoming instant authorities, which is really, really bad. That has to stop. I don't have an answer as to why the penalty is sometimes phrase-based and sometimes URL-based, but I can say that I've seen them alternate on the same domain, I've seen just the phrase-based issue occur and resolve itself, and I've seen the URL-based issue occur and resolve itself.

Confusing, isn't it?

So that's my vote...false authority scrapers that are causing temporary filtering as Google attempts to determine which is the more valid source, rectified by modification of both on-page and off-page tactics.

Cygnus

Martin40

5:44 pm on Feb 11, 2007 (gmt 0)

10+ Year Member



>Thinking that PR is the be-all and end-all is very 2002 thinking - this is 2007.

PR is still an important anti-spam tool. If a page that has been around for a while has no links at all, it's most probably spam.

This seems to be unrelated to PR. In fact, hundreds of pages with less PR - even PR 0 - are ahead of my missing pages.

I wouldn't expect Google to make these changes visible in the toolbar, they are most likely only temporary.

Even if the toolbar shows PR0, a new page may, and usually does, have some PR.

Whether or not PR is the tool that Google uses to send sites to the background isn't even important. What matters is the question why better-content, non-spamming sites are temporarily deranked.

TheWhippinpost

5:50 pm on Feb 11, 2007 (gmt 0)

10+ Year Member



>Now, how could a pic added on the 10th of Feb be part of the cache of a page done on the 7th of Feb?

Because it's linked to your server?

MHes

5:58 pm on Feb 11, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>it's a live download of your page and popped into the browser

No.... look again, especially at cache dates.

>Whether or not PR is the tool that Google uses to send sites to the background isn't even important. What matters is the question why better-content, non-spamming sites are temporarily deranked.

Welcome to 2007. This is way more than a PR issue.

[edited by: MHes at 6:03 pm (utc) on Feb. 11, 2007]

jk3210

6:11 pm on Feb 11, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



One of my pages is in the number 1 position with Site Links below it. Two of the pages (urls) occupying Site Link position 2 and 3 are ALSO repeated in the 900's. Same urls. Wouldn't that indicate that the 950 Syndrome is a separate process from the normal algo?

How else could Google rank the same url in positions 2 and 943 at the same time?

tedster

8:26 pm on Feb 11, 2007 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



I think that may happen because the extra Site Links are determined by a separate process. Then they are "attached" to the domain root in the number one position as extended information for the user. In other words, those urls do not actually rank on the first page of the SERP according to the algo. They sort of get carried there on the coat tails of the domain root.

Martin40

9:46 pm on Feb 11, 2007 (gmt 0)

10+ Year Member



>Basically, the Google cache is not a cache, guys - it's a live download of your page popped into the browser.

No, it's not. The images are downloaded live from your own server, but the HTML code is cached.

[edited by: tedster at 8:05 am (utc) on Feb. 12, 2007]

steveb

11:02 pm on Feb 11, 2007 (gmt 0)

WebmasterWorld Senior Member steveb is a WebmasterWorld Top Contributor of All Time 10+ Year Member



The penalty hits authority sites because it is all about the scoring of the page. Again, you can't understand the penalty unless you look at the group of sites there, not just "my site". Besides powerful niche authority sites capable of scoring highly with many pages, there normally are at least a couple dozen of those hacked/redirect puke "pages" listed. These are *extremely* high scoring pages, with tons of randomized anchor text, links from unique domains (meaning blog comment pages), randomized keyword text, and so on.

It may happen occasionally, but I've never seen a page that ranks #50 (for the poison word search) hit with this penalty (unless it was in a directory beneath a penalized page). High scoring pages are at risk, which means authority sites mistakenly get hit along with the spam sites being targeted.

annej

5:27 am on Feb 12, 2007 (gmt 0)

WebmasterWorld Senior Member annej is a WebmasterWorld Top Contributor of All Time 10+ Year Member



I got two article pages back today! I took the key phrase for each formerly missing page out of the page title, H1 tags, and any internal links. I also decreased the use of the phrase in the article text.
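For anyone wanting to audit their own pages the same way, here's a small script that counts how often a phrase appears in a page's visible text before and after dilution - the sample HTML and the helper name are just illustrative:

```python
# Count case-insensitive occurrences of a key phrase in a page's text,
# after stripping HTML tags. Useful for checking how much a phrase was
# diluted. Sample markup below is invented for illustration.
import re

def phrase_count(html: str, phrase: str) -> int:
    """Strip tags, then count occurrences of the phrase."""
    text = re.sub(r"<[^>]+>", " ", html)
    return len(re.findall(re.escape(phrase), text, flags=re.IGNORECASE))

page = "<h1>Antique Maps</h1><p>Antique maps are collectible. antique maps...</p>"
print(phrase_count(page, "antique maps"))  # 3
```

Running it before and after an edit gives a quick number to track, instead of eyeballing the page.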

My missing subdirectory and a few other pages are still gone. Still working on them.

Just reporting.

MHes

8:00 am on Feb 12, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



annej and Steveb - I agree with your observations. We may have found (probably stumbled upon) a fix, but it is way too early to know if it will stick. What gives me hope is that over the last six weeks we made changes that had zero effect: we just continued to pop in and out on a four-day cycle, with old rankings coming back as if we had changed nothing. Having undone all of that, we now have a means to at least make a difference. We are showing different pages for searches, but ranking at position 6 or 7 where before we were ranking 1, 2 or 3. This holds across hundreds of search phrases, from very competitive to longtail.

I think getting a new cache of your changes is obviously important, but there is also an offline analysis which needs to be updated. The big problem is knowing what data is being used offline - are they now using old data with previous 'fixes' that you have since abandoned?

annej - Are you ranking as high as before, despite your changes? Also, are you getting the same indented page combinations?

This 175-message thread spans 18 pages.
 
