Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

Penalty vs. Ranking Drop - where to draw the line?


1script

6:04 pm on Mar 20, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Hi guys, I've been posting in the SERPs updates thread about this, but I'm not sure my predicament has anything to do with the regular G update convulsions. I guess I'm looking for general ideas first, and I'll save the particulars for a different thread.

So, you see your traffic drop 90% in one day across multiple sites (with no connection between them), and you find your listings in the SERPs buried behind 10+ Google Books listings, parked domains, craigslist ads and 200+ other results, ranging from good to, well, questionable to outright spam.

Do you treat this as a drop in ranking and go back and analyze the content you added, the links you've acquired and all the rest of the SEO-related stuff? Or do you consider something like this a penalty and start sending reconsideration requests to Google, moving sites to other domains and taking other drastic measures?

What's the collective wisdom of this group: where to draw the line between ranking changes and a penalty?

aristotle

6:41 pm on Mar 20, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



When you say it happened on multiple sites, how many do you mean? Are they all your sites, with no other connection between them?

ken_b

6:45 pm on Mar 20, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Are all the sites on one server?

internetheaven

7:29 pm on Mar 20, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



where to draw the line between ranking changes and a penalty?


I don't, really. The way out of a penalty is the same as the way up in the rankings in most cases.

The only penalty that really counts as a "penalty" to me is when you disappear from the top 1000, i.e. don't show up at all.

Are all the sites on one server?


I have 28 sites on one server, shared between 6 IP addresses, all with the same Whois information. Several of them don't rank well at all ... the rest do alright.

The OP said "multiple" sites; it would be better if they could say whether it's all of the sites they own or just a percentage, and whether those share the same server, IP, Whois, etc.

1script

11:57 pm on Mar 20, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Hi guys, thanks for chiming in.

I guess I'll answer everybody's questions first: the sites are not all on the same server! Far from it. I rent multiple VPSes, all from different ISPs (but all in the US), and almost all of the sites (just a couple of exceptions) are on their own dedicated IPs. There are 3-4 sites per server but, again, all have different IPs anyway. I do it not to fool any search engines but to provide some sort of safety margin for my business: if one server falls over (and I just had a very nasty episode of 36 hours of downtime), the others keep chugging along. Besides, my sites are forums, and as they grow they become too heavy a load for one server to handle them all.

I run the sites, I own the domains (but I have WHOIS privacy on all), I have my Google Analytics and AdSense code on them. I also have the same Chitika account on all sites. That's about all that's the same between them.

Even so, I have two different Analytics and two different AdSense accounts, and both can be found on affected sites. BTW, before the question arises, there is nothing fishy about my two AdSense accounts - I opened them 7 years ago for two different businesses, which was just fine back then.

If you were REALLY seriously out to get me, you could probably gather the HTML code from all my sites and cross-reference the AdSense, Analytics and Chitika code with IPs and nameservers, and put them together that way. But why? I don't have any personal enemies, certainly not in the 'plex, and none of my competitors (that I know of) are bumping up against me in all niches.

Now back to the original question:

Needless to say, the results of a 90% drop in ranking are pretty much indistinguishable from a penalty. But I think the way out of a penalty could actually be different from repairing your rankings. You can submit a reconsideration request if you're certain it's a penalty. However, in this case I have no convenient "violation" warnings in my WMT. I've never had one, but I hear they exist. So what am I admitting to in that request?

trakkerguy

2:36 am on Mar 21, 2010 (gmt 0)

10+ Year Member



@1script-

I replied on the other thread about experiencing a similar hit on multiple sites myself last year. I've always believed the common AdSense and Analytics accounts are what tied them together.

But my sites were/are still truly "penalized" - they were #50 or lower for a search on their own domain name. They would still get some random long-tail traffic, but very little.

Do all your sites rank #1 for a search on their domain name?

1script

3:28 am on Mar 21, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



@trakkerguy

Well, that's the rub - I tested for the -50 penalty before I did anything else, and it's not that: all sites rank #1 for their domain name, with or without the TLD, in quotes or without quotes. What else is there to check for any known penalty?

Anyway, as far as what the ranking looks like - it is really weird. Most formerly ranking keywords are either nowhere to be found or around -200 and further out. However, there are some (even some of the long-time good traffic ones) that still rank, sometimes #1 in SERPs with 1,000,000+ results.

I thought this could be the result of the Caffeine update or something like that. However, this ranking picture has been very stable right from the start (everything dropped on March 15th), yet many people reported the update going live on many different datacenters throughout the week. You'd think there would have been times when both the regular and the Caffeine-type data were live somewhere at the same time, yet most of the formerly ranking keywords never showed up this week. And, again, those few that did show up rank OK, as if nothing happened.

In one funny case my site got six (!) listings on page #1 for a good 500,000-result search: two separate listings in their own right, with the second having four indented results. The rest of the guys watching this SERP must be furious, but it's not much consolation to me - 90 to 95% of the rest are simply missing in action.

I obviously have no idea what exactly is going on (else this thread would not be here), but in layman's terms it appears as if they looked at my list of ranking keywords (all 100,000+ of them, mostly long tail) and disabled my listings for 90-95% of them.

Anyone else with a similar picture, please chime in here!

tangor

5:51 am on Mar 21, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Not sure what google is doing these days. I don't rely on them anymore. Three of my seven sites have virtually disappeared from g's index, but all survive (and some are still #1) on Bing. No changes to pages or code ... but I do grow weary of trying to rise to the top of g. It's like swimming in a pool of tar these days. :(

internetheaven

11:11 am on Mar 21, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I have my Google Analytics and AdSense code on them. I also have the same Chitika account on all sites. That's about all that's the same between them.


That can certainly do it, though.

But as long as all your sites with the same Analytics and AdSense accounts have their own good backlink profiles, I don't see how it could be a problem. Maybe the general SEO tactics you apply to all your sites are no longer useful?

1script

4:13 pm on Mar 21, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Maybe the general SEO tactics you apply to all your sites are no longer useful?

That may very well be, of course. However, there is no single "SEO tactic" I use for all the sites. Like I said before, they are all about different subjects and are run as separate businesses.

That means links from different sites: some have viral links from social networking sites, some don't. Some post updates on Twitter (with backlinks to the site via bit.ly), some don't. Some have very reputable .gov sites linking to them (legitimately), some don't. Some have a bunch of WP template links pointing at them, but some don't have even one.

They are mostly based on the same (custom) forum software, but the templates are very different; also, for some sites the forum IS the site, while for others the forum is just a part of the site.

So, again, I would be happy to be able to identify what I was doing wrong in terms of promoting the sites, or a "general SEO tactic" that no longer works, but I'm afraid there is no single thing that is the same across all the sites.

And, again, despite being very different and disconnected sites, they all went down on the 15th of March. I guess I'm still looking for ideas.

Got a similar experience? Post it here, let's analyze it together!

tedster

5:33 pm on Mar 21, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'm beginning to think that your sites may have been hacked and are now hosting some cloaked content (possibly links). Cloaking is the factor that makes it hard for a site owner to diagnose. It is sometimes IP cloaking rather than user-agent cloaking, and sometimes today's hacking criminals install a cookie scheme or .htaccess tricks to prevent easy discovery.

Do you have Webmaster Tools set up for these sites? If so, you might see an indication in WMT. You can also use the "Fetch as Googlebot" utility to diagnose a page EXACTLY as Google sees it. In addition, there is sometimes an indication that you have parasite content because funky backlinks show up en masse. Such links would have been placed by the hacker to boost their own parasite links.
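
You can also run a quick first pass from your own machine. Here's a rough sketch in Python (untested, and the URL is just a placeholder) that fetches the same page with a normal browser user-agent and again with Googlebot's user-agent, then compares the two. If a hacker set up simple user-agent cloaking, the bodies will differ. Keep in mind that dynamic bits (timestamps, session IDs) can cause harmless differences, and IP cloaking won't show up this way at all, since you aren't fetching from Google's address range:

# Quick user-agent cloaking check - a rough sketch, not a definitive test.
# The URL is a placeholder; start with your formerly strong entry pages.
import hashlib
import urllib.request

URL = "http://www.example.com/some-page.html"
BROWSER_UA = "Mozilla/5.0 (Windows NT 6.1; rv:10.0) Gecko/20100101 Firefox/10.0"
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def fetch(url, user_agent):
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=15) as resp:
        return resp.read()

browser_body = fetch(URL, BROWSER_UA)
googlebot_body = fetch(URL, GOOGLEBOT_UA)

if hashlib.sha1(browser_body).hexdigest() == hashlib.sha1(googlebot_body).hexdigest():
    print("Identical responses for both user-agents.")
else:
    print("Responses differ - diff the two bodies and look for injected links.")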

---------

Another idea comes to mind. Google has been known to do a kind of automated "sweep" for a particular kind of ranking technique and then penalize all the sites that appear to be using that method. It is sometimes the case that a site gets "unfairly" penalized this way, like a dolphin caught in a tuna net, because of some technical similarity to the criteria Google used to build that "sweep".

Such sweeps would be designed to catch some type of violation that is described in Google's Webmaster Guidelines [google.com] - so dedicating a chunk of time to studying all the current wording there might trip an idea for you.

---------

The kind of ranking drops you are describing do not sound like the result of an algo change to me, because the degree of loss is too extreme. The situation you describe has the feel of a "true" penalty to me, and a lot of the time true penalties are link-related.

So I would make links - both inbound and outbound - the first point of study. Ironically, Bing may help you understand Google better, because their "linkfromdomain:" operator [bing.com] gives you a very helpful listing of a site's outbound links.
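
If you haven't used it before, the syntax is simple - something like this (example.com is just a placeholder, and if I remember the operator stacking right, you can combine it with -site: to drop internal links):

linkfromdomain:example.com
linkfromdomain:example.com -site:example.com

The first query lists the pages example.com links out to; the second filters out links pointing back at its own pages.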

[edited by: tedster at 6:48 pm (utc) on Mar 21, 2010]

crobb305

6:27 pm on Mar 21, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



You can also use the "Fetch as Googlebot" utility to diagnose a page EXACTLY as Google sees it.


Ted, I have used this feature to fetch, but it appears you have to enter one page at a time? Is there a way to fetch the entire site that I'm not seeing? If you have a large site, and someone has inserted hidden links in an obscure place, they may be hard to find with the fetcher.

tedster

6:43 pm on Mar 21, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yes, it only retrieves one URL at a time. If you have pages that were previously strong entry pages from search but now are not, I'd say those are a good place to start.

There are some other research challenges in WMT, too. For example, if you want to see backlinks to internal pages, you also need to type in those URLs one at a time. But if you've lost 90% of your traffic over many sites, then you should have some good ideas about which pages have been hurt the most. It is very unlikely that one bad outbound link on an obscure page is going to give you a devastating penalty. Big penalties are usually the result of big patterns.

[edited by: tedster at 7:09 pm (utc) on Mar 21, 2010]

1script

7:02 pm on Mar 21, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



@tedster:

Thanks, great ideas! Yes, the sites are constantly under siege from spammers, and on some sites I do very extensive IP blocking (.htaccess-based). I hope I did not block any of Googlebot's addresses ... But then again, I don't block the same IPs on all sites.
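
To double-check that, I'm going to run my blocked list through something like this (a quick untested sketch - the IPs shown are placeholders; the logic is the reverse-then-forward DNS check Google recommends for verifying Googlebot):

# Flags any blocked IPs that verify as genuine Googlebot addresses.
# Untested sketch: paste your .htaccess deny list into blocked_ips.
import socket

blocked_ips = ["66.249.66.1", "203.0.113.5"]  # placeholders

def is_googlebot(ip):
    try:
        host = socket.gethostbyaddr(ip)[0]  # reverse DNS
    except (socket.herror, socket.gaierror):
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        # Forward-confirm: the name must resolve back to the same IP.
        return ip in socket.gethostbyname_ex(host)[2]
    except (socket.herror, socket.gaierror):
        return False

for ip in blocked_ips:
    if is_googlebot(ip):
        print("WARNING: %s verifies as a real Googlebot address!" % ip)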


I'll do some extensive "Fetch as Googlebot" research and post the results here.

As far as the linkfromdomain: Bing searches go, the one thing they do reveal is how little I actually link out. All the links that show up point only to good, reputable sites. I have a policy of not converting URLs in user-submitted content into links, out of fear of linking to bad sites. And so Bing only comes up with something like a dozen links for sites of 100,000+ pages.

Do you think this could be an issue here?

tedster

7:13 pm on Mar 21, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Well, Bing is one resource, but if cloaking is being done, then it could easily be specific to Googlebot, and Bing wouldn't see it.

I bring up cloaked parasite links for a very good reason - this criminal practice appears to be an epidemic right now, and one that is still growing. If you've got ranking problems and you honestly, in your deepest heart of hearts, don't know how you might be violating Google's webmaster guidelines, this is a good place to start.

ken_b

7:15 pm on Mar 21, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



How accurate is that Bing "linkfromdomain" operator?

I just used that for my site and got 3 VERY different numbers.

66,000+, 34,800, and 8,434.

I hit the search button 3 times because the first two answers were mind-boggling. The third answer might be close.

tedster

8:02 pm on Mar 21, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



As with the Google and Yahoo numbers - Bing's massive distributed infrastructure makes such an estimate challenging, and you just can't take it to the bank! But in the individual results you can discover specific sites that you didn't know about.

Bad outbound links can happen to even a site with the most intensive monitoring. A few years back, Google itself was hosting a spammy outbound link on one of their informational pages - placed there, apparently, by someone who was on staff until they were caught.

1script

1:12 am on Mar 22, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I bring up cloaked parasite links for a very good reason - this criminal practice appears to be an epidemic right now, and one that is still growing. If you've got ranking problems and you honestly, in your deepest heart of hearts, don't know how you might be violating Google's webmaster guidelines, this is a good place to start.

I'm by no means done checking, but there is an apparent trend that I can see. On most of the sites I checked in WMT there is at least one incoming link from a bad site that is trying to get its spam post on my site indexed and ranking.
After following where the link points, I found several more spam post pages. The posts did have the URLs of the bad sites in them, but they were not converted into links - just mentioned in plain text.

I don't want to sound complacent, because I do delete all the spam I come across right away. But my line of thinking was that spam without links may be an annoyance but not a danger. And so I'm now finding spam posts that I missed going back to 2006 or even earlier, depending on the site's age.

So, can you get in trouble with the big G just by mentioning the bad URLs in text, not as a properly HTML-formatted, "clickable" link?

Well, aside from that, it does not help that those spam posts usually mention the products they are peddling like 1000 times on the same page - speaking of Google's TOS and keyword stuffing ...

Do you guys think that's reason enough to go through all the posts with a finer comb, all the way back to the beginning, and then send a reconsideration request based on bad links and/or keyword stuffing in user-generated content?
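
For that finer comb, I'm thinking of a first pass along these lines (a rough, untested sketch - "posts.txt" and the cutoff number are just placeholders for however your forum stores posts):

# Flags posts that contain plain-text URLs or repeat one word absurdly often.
# Rough first-pass filter; expects one post per line in posts.txt.
import re
from collections import Counter

URL_RE = re.compile(r"(?:https?://|www\.)\S+", re.IGNORECASE)
MAX_REPEATS = 50  # arbitrary stuffing cutoff - tune for your content

with open("posts.txt", encoding="utf-8") as f:
    for num, post in enumerate(f, 1):
        urls = URL_RE.findall(post)
        words = Counter(w.lower() for w in re.findall(r"[A-Za-z']+", post))
        stuffed = [w for w, n in words.items() if n >= MAX_REPEATS]
        if urls or stuffed:
            print("post %d: urls=%s stuffed=%s" % (num, urls[:3], stuffed[:3]))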

And last but not least: having this many sites to check, I'm cursed with always finding a counter-argument. I'm finding more spam on the couple of sites that are still standing than on those that were killed. Go figure ...

1script

10:33 pm on Mar 22, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



At the risk of beating a dead horse, I ran a few tests to see if some of my formerly ranking pages could have been affected by parasitic hosting. I don't think that's the case. The pages returned through direct browsing, WMT's "Fetch as Googlebot", and Rex Swain's HTTP Viewer thrown in for good measure all appear to be identical. So whatever beef Google has with me, it has to do with my sites themselves, or with me personally through the shared Analytics account ...