Forum Moderators: Robert Charlton & goodroi

Google penalty observation - out of -950 into -50?

SEOPTI

5:20 pm on Dec 11, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



My whole site was in the -950 box for about 4 months. After deoptimizing internal anchor text it ranks again at position -50 for almost all queries.

I'm quite familiar with the -950 stuff and have been able to recover sites by deoptimizing internal anchor text, but the -50 is still a mystery to me.

I think the most interesting part is that the site went from a -950 to a -50.

bwnbwn

9:30 pm on Dec 11, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



SEOPTI, I had the exact same thing happen to me about two years ago. I bet you had a couple of searches you still held, and the domain name search brought your site up as it should.

I too was 950'd and made a ton of changes, but all I could get to was 50th, 30th, and in between. Early this year Matt made the comment that they were releasing some filters, and I was out and have been slowly building back to the pre-950 levels. I was filtered for almost a year and have been out now for almost a year.

10-to-1 your site is under a filter, not a ban. Don't do anything stupid, but continue to tweak and make improvements to the nav, internal linking, and user experience. Continue trying to get some good links from within your industry, and in about 4-6 months hopefully they will do a release and you will get out from under it.

I bet it was a manual review that got it filtered, and one that will get it released.

Do a search using the whole title of your front page minus the domain name and see where you rank, then add the domain name and see where you rank.

Then try an interior page title the same way and see where you land.

Let me know where you rank for both.
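If you want to run this check without counting result positions by hand, a tiny helper can find a domain's rank in an ordered list of result URLs. This is purely an illustrative sketch: `rank_of_domain` is a made-up name, and it assumes you have already collected the result URLs for each query by whatever means you normally use.

```python
from urllib.parse import urlparse

def rank_of_domain(results, domain):
    """Return the 1-based position of the first result whose host is
    `domain` (with or without a leading 'www.'), or None if absent."""
    for i, url in enumerate(results, start=1):
        host = urlparse(url).netloc.lower()
        if host in (domain, "www." + domain):
            return i
    return None

# Run the title-only and the title-plus-domain searches, then compare:
# rank_of_domain(title_only_results, "example.com")
# rank_of_domain(title_plus_domain_results, "example.com")
```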

SEOPTI

1:59 am on Dec 12, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



bwnbwn, thank you for this helpful information. It's very interesting to see you believe it's a filter; I also think it's some sort of filter where PR has been devalued.

This -50 filter hits sites with unique content; most of my sites have a really healthy supplemental ratio, so I can rule out unique content as the cause.

I think it's probably a protection system where a site which has been in the penalty box for a few months (-950) will not start ranking again immediately but will slowly climb its way to the top. But of course it's just speculation.

As always it's a pain ... you give them unique content and they hit your site with this kind of nonsense.

tedster

5:03 am on Dec 12, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



a site which has been in the penalty box for a few months (-950) will not start ranking again immediately but will slowly climb its way to the top. But of course it's just speculation.

I can confirm that there is an automated "gradual recovery" treatment from Google. Some penalties just pop out into full restoration, but others recover stepwise.

fearlessrick

2:52 pm on Dec 12, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Feel I should comment here. I have written three times to G about inclusion, reinclusion, and penalties (my PR went from 5 to 4 to 0 and now back to 2 for my main site). One issue was my content being used by a very large retailer named after a river in South America, and G thinking I had copied them. I have endeavored to straighten things out with both companies, as I have extensive proof that my content is original. I did not file DMCAs, though I still might, as that large retailer tried to brush me off without doing anything.

In any case, my traffic (and AS revenue) has been steadily improving since September, and just last night it began to really explode. As of this writing, at 9:00 am ET my traffic is already 70% of a normal full day and revenue is already at 90%. Don't know whether it's a click attack (I somewhat doubt it because the pageviews are so high and it's been over many hours, though it could be any number of scrapers), but I really think that a lot of pages that were in the supplemental index or otherwise penalized are now showing up in the top 10-25 results.

Not being a nut-case about tracking these kinds of things, I have checked some of the pages in question and they now have moved up in the SERPs. Am I looking at a basic doubling or tripling of my traffic (and revenue) or should I be worried about losing it all due to "invalid clicks?"

Any comments, guesses, observations would be appreciated. Thanks.

SEOPTI

6:09 pm on Dec 13, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The part which I don't understand about the -50 filter is why they devalue all links: even Yahoo, DMOZ and all other links don't count any more.

You can acquire as many valuable links as possible after the filter kicks in, but it doesn't play a role any more; it won't improve your ranking. The URLs of the filtered site stay at -50. This is really harsh.

mirrornl

12:20 am on Dec 14, 2008 (gmt 0)

10+ Year Member



or even as good as -29

tedster

12:57 am on Dec 14, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



after the filter kicked in but it does not play a role any more, it won't improve your ranking

Yes, this definitely happens. Some penalties seem to be just a "minus whatever" flag that won't be budged without a manual reset, whereas others can still be moved. Some penalties might also involve diminishment by a percentage rather than a subtraction, and that percentage either can or cannot change. This is one reason why I don't care for the "minus whatever" names for penalties.

Sometimes there seems to be a manual release of the penalty flag for many sites at one time, and no amount of changing before that date will help at all. It can be pretty hard to figure out all these differences from the outside looking in. When I have a situation that won't budge and I can't see any root cause for it, I just move on for a while.

[edited by: tedster at 1:42 am (utc) on Dec. 14, 2008]

SEOPTI

1:41 am on Dec 14, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Interesting, you are right about the gradual penalty/filter release/recovery, tedster. I've been reading a trusted German SEO blog, and the owner there documented the gradual release of one of his sites using a traffic chart from his analytics software.

Gandhalf

1:47 pm on Dec 16, 2008 (gmt 0)

10+ Year Member



bwnbwn

I did part of what you suggested:

"Do a search using the whole title of the front page on your site minus the domain name and see where your rank is, then add the domain name and see where your rank is."

- Minus my domain name, for the title of the homepage (PR6) I only find one internal page at the 150th position.

- With my domain name, for the title of the homepage I find another internal page at around the 250th position.

Needless to say, I used to rank first for both.

For a specific internal page (PR5), same type of search: it ranks first.

So some pages are penalized, some others not.

Any idea what kind of penalty I'm suffering?

bwnbwn

3:47 pm on Dec 16, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Gandhalf, you are under a manual penalty for something. Either it's cosmetic (does your site have something that may be considered offensive to any race or nationality?), or check whether you have a Google Webmaster account and whether there are any messages from them (possibly your site was hacked), or you have been involved with some shady dealings that a human review could discover, resulting in an induced filter.

Are you a member here? If not, I would suggest the best money you could spend would be to join and, after following the guidelines in the review section, submit your site for us to review.

Gandhalf, I did everything, and I mean everything, to figure this one out (I think I read the 950, 30 and 60 filter threads more than once, and that is a big, big read). After submitting my site to be reviewed here and completing the suggested changes, when I was reviewed by Google my site was released from under the filter a couple of months later. I am sure some of my changes did help, but I feel the review here showed me where and why the filter was applied.

I was placed under a filter by a human and released by a human review, no doubt in my mind.

Consider having your site reviewed here [webmasterworld.com...]; as I said, it's the best money you'll ever spend.

If you do submit your site for review, you had better leave your ego at the door and take no offense at what is suggested, otherwise you will not get the help you need. The review may lead to the discovery, but it could still be months before you're released.

Gandhalf

4:39 pm on Dec 16, 2008 (gmt 0)

10+ Year Member



bwnbwn,

thx for your feedback.

The only thing I can think of was an SQL injection that occurred about 3 months ago. There was a message in GWT, and I fixed the issue.

They removed the message in GWT and that was it.

In the meantime, the website has been penalized since June 26th, although a search on the title of my homepage got me first place in the SERPs.

It looks like one penalty plus another occasional penalty?

Anyway, thanks for the idea.

I think I will consider a manual review here.

bwnbwn

6:21 pm on Dec 16, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Gandhalf, the message was posted about 3 months ago, but that doesn't mean it happened then; it could have been an issue a couple of months before that which got the site filtered.

SEOPTI

10:46 pm on Dec 20, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I have analyzed some of the -50 competition sites, and what they have in common is really poor design and content structure; in most cases there is no CSS design at all, just a white background and text.

This supports the theory that it could be a manual review. They seem to look at design and content structure.

The next interesting part: I have found sites with 5 million results in the index and sites with a few hundred results, both having exactly the same -50 devaluation. The site with 5 million results even has a good backlink profile, with links from Yahoo and DMOZ, but a poor design, and it ranks for 5 million URLs in the -50 box.

[edited by: SEOPTI at 10:51 pm (utc) on Dec. 20, 2008]

SEOPTI

11:02 pm on Dec 20, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Probably they have too many reviewers, otherwise they would not review sites with just a few hundred URLs. Or we are just wrong and it's a dumb machine, but a machine is not able to judge the design of a site and the structure of its content.

tedster

11:24 pm on Dec 20, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



a machine is not able to judge the design of the site and structure of content

I'm not sure that's 100% true. Google's doing some pretty sophisticated things these days. For example, did you see the comment from Matt Cutts about automatically examining layouts for large gaps?

...our search algorithm saw a large area on the blog that was due to an IFRAME included from another site and that looked spammy to our automatic classifier.

Google Groups discussion [groups.google.com]
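We can only guess at what Google's classifier actually does, but a crude version of "a large area due to an IFRAME" is easy to imagine: sum the declared width x height of each iframe and compare it against an assumed page area. The toy sketch below does exactly that; the function name, default page dimensions, and attribute-only measurement are all my own invention for illustration.

```python
import re

def iframe_area_fraction(html, page_width=1000, page_height=2000):
    """Fraction of an assumed page area covered by <iframe> tags that
    declare explicit width/height attributes (a crude layout signal)."""
    covered = 0
    for match in re.finditer(r"<iframe[^>]*>", html, re.IGNORECASE):
        tag = match.group(0)
        w = re.search(r'width=["\']?(\d+)', tag, re.IGNORECASE)
        h = re.search(r'height=["\']?(\d+)', tag, re.IGNORECASE)
        if w and h:
            covered += int(w.group(1)) * int(h.group(1))
    return covered / (page_width * page_height)
```

A real renderer would measure laid-out geometry rather than trusting attributes, but even this rough ratio would let an automatic system flag pages where embedded frames dominate the layout.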

SEOPTI

6:50 pm on Dec 21, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yes, I saw this comment. I don't use iframes.

Today I got a second site released from the -950 box by deoptimizing internal anchor text a few weeks ago. Guess where it is now: it's in the -50 box. This seems to be hardcoded in the algorithm.

This is really funny!

[edited by: SEOPTI at 6:51 pm (utc) on Dec. 21, 2008]

SEOPTI

7:15 pm on Dec 21, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



And the third site (same procedure). I almost can't believe it; this is foolproof.

[edited by: SEOPTI at 7:15 pm (utc) on Dec. 21, 2008]

gouri

7:48 pm on Dec 21, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Was the internal anchor text a part of the body text on some of your pages?

SEOPTI

3:45 am on Dec 22, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It was the internal anchor text within the body copy, not the footer, not the navigation.

Gandhalf

6:21 am on Dec 22, 2008 (gmt 0)

10+ Year Member



SEOPTI, what is internal anchor body text, if I may ask?

potentialgeek

9:24 am on Dec 22, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Today I got the second site released from the -950 box by deoptimizing internal anchor text a few weeks ago. Guess where it is now: it's in the -50 box. This seems to be hardcoded in the algo.

Any reason why you didn't deoptimize internal anchor text months ago? If memory serves, I'm pretty sure you were in conversations about the 950 Penalty at the time. Removing spammy anchor text is how I and others here also got out of that penalty.

I'm not sure that's 100% true. Google's doing some pretty sophisticated things these days. For example, did you see the comment from Matt Cutts about automatically examining layouts for large gaps?

I agree with Tedster that Google's coders can automate examination of a site's architecture and layout. It's easy to get the code to check for CSS, background colors, etc. We already know they detect "hidden text" based on page colors, text colors, and the contrast. Of course they've done that for years. You have to anticipate more sophisticated algo updates.
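To make the contrast check concrete: the standard way to compare a text color against its background is the WCAG relative-luminance contrast ratio, which runs from 1:1 (identical colors, i.e. invisible text) to 21:1 (black on white). The sketch below implements that public formula as an illustration of the idea; the function names and the flagging threshold are my own, not Google's.

```python
def relative_luminance(rgb):
    """WCAG 2.0 relative luminance of an (r, g, b) color with 0-255 channels."""
    def channel(c):
        c = c / 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio between two colors, from 1.0 up to 21.0."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

def looks_hidden(fg, bg, threshold=1.5):
    """Flag text whose color barely differs from its background."""
    return contrast_ratio(fg, bg) < threshold
```

White-on-white scores 1.0 and gets flagged; black-on-white scores 21.0 and passes, which matches the hidden-text detection described above.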

I was placed under a filter by a human and released by a human review no doubt in my mind.

I think we need to give the coders a little more benefit of the doubt in the sense that we should anticipate PhDs can design algos which are smart enough to do things that appear to be done by human review.

Very plain sites could get flagged as possible spam (spammers are lazy) and then examined for many more possible spam signals. I would program the algo to look out for very plain sites (design) with very little content (or little original content).

p/g

SEOPTI

2:09 am on Dec 26, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



For all who are affected by the -40, -50 or -60 filter: please do a search for your domain name without the extension and without www.

If your search preferences are set to 100 results per page, your domain will appear #1; if you set your preferences to show 10 results per page, your domain will not be #1 any more.

I'm quite sure this will hold for nearly all (99%) of the domains affected by this filter.

As long as the domain name without extension can NOT be found at #1 for both 100 results and 10 results, this filter is in effect and you won't be able to rank for inner URLs.
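If you save the two result lists (100-per-page and 10-per-page) for the bare domain-name query, this signature can be checked mechanically. A hypothetical sketch; the function name and the loose substring match are my own:

```python
def minus_50_signature(results_100, results_10, domain):
    """True when the bare domain query puts the site at #1 with 100
    results per page but not with the default 10 per page -- the pattern
    described above. `domain` is matched as a substring of the URL."""
    def first_is_ours(results):
        return bool(results) and domain in results[0]
    return first_is_ours(results_100) and not first_is_ours(results_10)
```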

SEOPTI

2:19 am on Dec 26, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Tedster, I've found this comment from Mr. Cutts, which refers to the Google Groups discussion; it was made earlier this year:

"If lots of sites have the same template, that's not really a problem. It's when they don't have anything useful and valuable for users that we try to find them and throw them out. We do use HTML templates as a way to find sites/pages that are low value - we can just search for blocks of code and find all the pages using a certain template and if they all tend to fit this pattern of low value, we can toss them all out together."

I think this clearly speaks for manual reviews of sites, which may be the cause of those -40, -50 ... penalties. I think the reviewer checks sites, and if he finds a site with low value he checks for similar blocks of content on other sites; this way he finds the sites which belong to the same webmaster.

If they are also low-value sites, the filter kicks in for all of them.

On the other side, there is the comment about the
"... automatic classifier ...", which tells me some penalties are automatic.

[edited by: SEOPTI at 2:45 am (utc) on Dec. 26, 2008]
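The "search for blocks of code" idea in that Cutts quote can be approximated with a template fingerprint: strip out all the visible text and hash only the tag skeleton, so that pages built from the same template collide on the same hash regardless of their content. The sketch below illustrates that one assumption; the class and function names are invented, and a production system would no doubt normalize far more aggressively.

```python
import hashlib
from html.parser import HTMLParser

class TagSkeleton(HTMLParser):
    """Collects only the tag structure of a page, ignoring all text."""
    def __init__(self):
        super().__init__()
        self.tags = []
    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)
    def handle_endtag(self, tag):
        self.tags.append("/" + tag)

def template_fingerprint(html):
    """Hash the tag skeleton so pages sharing a template collide."""
    parser = TagSkeleton()
    parser.feed(html)
    return hashlib.sha1(" ".join(parser.tags).encode()).hexdigest()
```

Two pages with identical markup but different wording produce the same fingerprint, so grouping an index by this hash would surface every site stamped out from one low-value template.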

webguybri

10:36 pm on Dec 26, 2008 (gmt 0)

10+ Year Member



Would this type of -50 etc. penalty be applied more often for:

1. too many links pointing to a page with "exact keyword" link text

2. getting links from blog comments

3. both