
Keyword specific 1 page drop penalty after human review



11:49 pm on Sep 14, 2008 (gmt 0)

5+ Year Member

One of Google's latest penalties hasn't gotten much attention. It causes a relatively small 10-position drop and appears to be keyword- or phrase-specific. Your site can rank for "keyword1 keyword2" but drop in rankings for "keyword1". Or you can still have a good ranking for "keyword3" but drop to the second page for "keyword4".

I have quite a few clean sites affected already. In all cases the sites have been manually checked by Google's web quality team members (in either Mountain View or Dublin) or human reviewers before the drop. A page gets checked and a few days later it drops from page 1 to page 2 for a given keyword or phrase. Nothing you do will get you back to page 1. I've tried various things for several months, like adding new pages, links, etc. Sometimes you can see your site back on page 1, but it's only a brief moment, a few hours max when G is switching filters on and off - usually during night time. I've seen my site get up to #4 only to drop back to #14 soon after. During Google Dance I can see another site "dance" between #3 and #13 - and end up #13.

Affected sites are clean even by Google's standards. If they were breaking guidelines then the manual check would have resulted in a much stricter 30/60/950 penalty. However, I'd still call this quite a severe penalty as dropping from page 1 to page 2 for certain keywords takes away most of the natural traffic.

Many webmasters think a one page drop is an algo update and they need more links or onsite SEO to get back. Well, after almost one year of sitting on page #2 for a non-competitive "keyword1+keyword2" I'm pretty sure it's a penalty. I'm stuck at #11 yet the same subpage ranks ok for "keyword2+keyword3". I tested doing a 301 to a new subpage and interestingly Google's human editors were there within days checking the new page through their "rating task" system.

Because it is likely a manual filter/penalty I need to send a reconsideration request to Google. I can only guess the reason for the penalty. Affiliate content or unnatural backlinks are my best guesses. Bottom line: it is a very frustrating penalty which I'm sure many webmasters have without them knowing about it. This penalty can be hard to spot because you can continue to rank high for your main keywords. As stated in the title, it is a keyword specific penalty.

I'd like to know if anyone here has seen something similar. What has happened to your rankings after being checked by human reviewers? Do you feel like you're stuck on page 2 and can't get to page 1, not even #10 where you have been before?


2:34 am on Sep 15, 2008 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

This brings to mind the phenomenon we discussed in another thread: Position 4 Gets Strange [webmasterworld.com]. Particularly because you mentioned a cycle of "flashing" between #4 and #14, or #3 and #13. Position #4 is particularly sensitive, since it's one of the spots where a universal search result can be forced (excuse me, blended) into the SERP.

That earlier discussion missed the human review component that you mention seeing, however.

In the case of very competitive phrases, the editorial squad does evaluate the first page SERP - and if there's agreement across several independent evaluations that a given url is ranking unnaturally high, it can get demoted. The flat value of "minus 10" would be a new twist here, however.

The "unnatural backlinks" observation would line up with the earlier discussion, to a degree at least. I know of several sites that intentionally went after ranking on a competitive phrase and, all of a sudden, were stuck on page two when it seemed that, without human intervention at least, they "should be" on page one.


4:18 am on Sep 15, 2008 (gmt 0)

5+ Year Member

we were at the top for a year; recently dropped to 4/5 ..... strange to me


12:20 pm on Sep 15, 2008 (gmt 0)

5+ Year Member

b2net, I have been penalized for specific keyword phrases too, and they are quite competitive. My site is highly relevant for those keywords, and the keywords are in the domain name. I did a lot of link building using the penalized phrase before, so my suspicion is that the large percentage of my backlinks containing those phrases is probably why I was 950'd - I'm nowhere to be seen. Otherwise it's a manual penalty by a human reviewer, but then why 950 me when my site is highly relevant? So I'm leaning towards an automated penalty. Does that make sense?

Martin Ice Web

1:26 pm on Sep 16, 2008 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

b2net and tedster,

why should Google penalize a site with a 10+ position drop via human review if it's clean and its relevant content should rank on the first page?
That doesn't make sense, since Google wants to offer the most relevant results.


5:39 pm on Sep 16, 2008 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

The human evaluators might feel that other URLs make better offerings to the end user - especially giving consideration to whatever Google has learned about user intention for that particular query.


12:31 am on Oct 1, 2008 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member

Today I've lost one fairly major term that I have held for almost 2 years... Still, a lot of other terms remain, and long-tail variations of this term continue to drive traffic. Very strange, and I'm hoping it's a glitch, as my site is very relevant. Oh yeah, and the site that is now number 1 is a pile of fraudulent trash...


4:40 am on Oct 1, 2008 (gmt 0)

10+ Year Member


It's not strange that G has this new algo. As a webmaster, you should be ready for any position in the SERPs. Who would advertise with AdWords on G if your listings were always number one or two? The idea is that simple, even if your site is very clean by Google's standards.


4:53 am on Oct 1, 2008 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

Let's focus back on the topic that the original post set out - a one page drop after a human review. There's too much complexity in today's Google to mix in anything different.

The only way to know if your 1-page drop falls into this category is to watch your server logs for Google IP addresses that aren't googlebot. If you're not watching your server logs and you are struggling with ranking problems, this might be a good step to add to your toolkit.
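A minimal sketch of that log check, assuming the common Apache/NGINX combined log format. The 66.249.* prefix is one published Google crawler range, used here purely for illustration - verify the current ranges yourself before relying on this:

```python
import re

# Combined Log Format: ip - - [date] "request" status size "referer" "user-agent"
LOG_RE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "[^"]*" \d+ \S+ "[^"]*" "([^"]*)"')

# One Google-owned prefix (66.249.0.0/16 covers well-known crawler ranges).
# These prefixes are an assumption for illustration; check WHOIS data.
GOOGLE_PREFIXES = ("66.249.",)

def suspected_human_reviews(lines):
    """Return (ip, user_agent) pairs for hits from Google IP space whose
    user agent is NOT googlebot -- candidate manual-review visits."""
    hits = []
    for line in lines:
        m = LOG_RE.match(line)
        if not m:
            continue
        ip, ua = m.group(1), m.group(2)
        if ip.startswith(GOOGLE_PREFIXES) and "googlebot" not in ua.lower():
            hits.append((ip, ua))
    return hits

sample = [
    '66.249.66.1 - - [01/Oct/2008:09:00:00 +0000] "GET / HTTP/1.1" 200 512 '
    '"-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '66.249.80.9 - - [01/Oct/2008:09:05:00 +0000] "GET /page.html HTTP/1.1" 200 1024 '
    '"-" "Mozilla/5.0 (Windows; U) Firefox/3.0"',
    '10.0.0.5 - - [01/Oct/2008:09:06:00 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
print(suspected_human_reviews(sample))
```

Here only the second sample line is flagged: it comes from the Google prefix but carries a browser user agent rather than Googlebot's.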


9:37 am on Oct 1, 2008 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member

Mine came back after a few hours - glitch.


9:11 pm on Oct 1, 2008 (gmt 0)

5+ Year Member

This is probably a dumb question, but how do you know if a manual reviewer has visited your site? Do you just look for referrers that come from the google domain with a URL that includes "rating task" in it?
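For what it's worth, that referrer check could be sketched like this. Note that the "rating"/"rating task" substring, and the whole idea that rater-task URLs leak into referrers, is hearsay from this thread, not anything Google documents:

```python
import re

# In combined log format the referrer is the second-to-last quoted field.
REF_RE = re.compile(r'"([^"]*)" "[^"]*"$')

def rating_task_referrers(lines):
    """Return referrer URLs from a google domain that mention 'rating'.
    Speculative heuristic: the 'rating task' URL pattern is forum
    hearsay, not a documented Google behavior."""
    found = []
    for line in lines:
        m = REF_RE.search(line.rstrip())
        if not m:
            continue
        ref = m.group(1)
        if "google." in ref and "rating" in ref.lower():
            found.append(ref)
    return found

sample = [
    '1.2.3.4 - - [01/Oct/2008:10:00:00 +0000] "GET /page HTTP/1.1" 200 99 '
    '"http://www.google.com/evaluation/ratingtask?id=7" "Mozilla/5.0"',
    '5.6.7.8 - - [01/Oct/2008:10:01:00 +0000] "GET / HTTP/1.1" 200 99 "-" "Mozilla/5.0"',
]
print(rating_task_referrers(sample))
```

The example "ratingtask" URL above is invented to show the matching logic; an empty result on real logs proves nothing either way.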


9:51 pm on Oct 1, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

Martin Ice Web, you've been deceived into thinking that the Google machine algorithm represents "relevance." It doesn't. It's simply a mathematical function that (in the Googletechs' opinion) often represents an approximation of relevance.

The page [vuw.ac.nz...] contains one independent observer's list of things that should matter when evaluating the "quality" of a website. I would specifically emphasize "Scope", since it is an often-overlooked but significant criterion in Google's list, and "Content", since that covers various issues which a computer can't possibly address -- but which human reviewers do.

The computer doesn't know the difference between the American Cancer Society and the various boiler-room fraud fundraisers with names of the form adverb-diseaseName-collectiveNoun. The computer doesn't know the difference between Joe Blow (plagiarist extraordinaire and serial doorway-page spammer) and professor Joseph Blough (Nobel Prize winner and global philanthropist). But people know people who know people who DO know the difference. The computer can't spot the difference between Eliza'd plagiarized paragraphs and genuine coherent thought (well, some people can't either, but there's always a Verity Stob who can.)

Well, these are the extreme cases: most of us fall between them, but when human reviewers get involved, we all are subject to those kinds of judgments.

Which are far more relevant than the results of any computer algorithm, no matter how complex.

