Google vs. Splogs: The October Massacre

But how many of us are victims of "collateral damage"?

anax

7:34 pm on Dec 5, 2006 (gmt 0)

10+ Year Member



Everybody hates splogs, me included. "Splogs" are auto-generated scraper blogs that post mainly junk text copied from other sites. They tend to use keyword-stuffed domains (buy-blue-widgets.widgetworld.com), and are of course covered with ads. Their purpose is just to flood the SERPs in the hope that people will land on their pages.
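
To make that domain pattern concrete, here's a toy sketch of the kind of hostname check a filter might run. This is pure guesswork on my part; the function name, the threshold, and the approach are all made up for illustration:

    import re

    def looks_keyword_stuffed(hostname):
        # Take the leftmost label, e.g. "buy-blue-widgets" from
        # "buy-blue-widgets.widgetworld.com".
        subdomain = hostname.split(".")[0]
        words = [w for w in re.split(r"[-_]", subdomain) if w]
        # Several hyphen-separated keywords in one label is the
        # classic splog tell described above.
        return len(words) >= 3

    print(looks_keyword_stuffed("buy-blue-widgets.widgetworld.com"))  # True
    print(looks_keyword_stuffed("www.example.com"))                   # False

A real filter would obviously combine many more signals; the point is only that the hostname itself can be scored mechanically.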

There was a thread here [webmasterworld.com] on WebmasterWorld a few weeks ago from a number of people who had just lost a large part of their Google traffic. I was one of those people: on October 21st my Google traffic dropped by 75% and it has not changed since. Traffic from Yahoo and others did not drop, only Google (and Google was of course my primary source of traffic).

Matt Cutts in his blog recently seemed to suggest that in the past few weeks Google has been implementing a splog filter. (He didn't say so explicitly, but that's how I read him.) I think I'm a victim of "collateral damage" from this splog filter, and I'm guessing maybe some of the others who got caught in this "October Massacre" were also, like innocent people walking along the street who get mistakenly picked up in a police dragnet.

Google should fight splogs. The question is: how can those of us who run clean, white-hat sites avoid getting caught in their dragnet by mistake? There was a vague hint that WordPress blogs are suspicious; I do use WordPress to manage my site. Should we delete our RSS feeds as a quality signal? I certainly don't need them; they are just created automatically by WP. I have registered my domain for a long period; Google could count that. I create all my pages by hand, with no automation other than that of the blog template itself. My site is something of a public-service site, and so is the opposite of a splog.
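
For anyone who wants to see exactly which feed links their WP template advertises before deciding whether to remove them, here's a minimal audit sketch (assuming Python 3, with example.com standing in for your own site):

    from html.parser import HTMLParser
    from urllib.request import urlopen

    class FeedLinkFinder(HTMLParser):
        """Collects the feed URLs a page advertises in its <head>."""
        def __init__(self):
            super().__init__()
            self.feeds = []

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            # WP templates emit tags like:
            # <link rel="alternate" type="application/rss+xml" href="...">
            if tag == "link" and "rss" in (attrs.get("type") or ""):
                self.feeds.append(attrs.get("href"))

    html = urlopen("http://example.com/").read().decode("utf-8", "replace")
    finder = FeedLinkFinder()
    finder.feed(html)
    print(finder.feeds)  # every feed URL the template advertises

Whether removing those links actually helps with any filter is exactly what we don't know, of course.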

I know all the usual quality signals, and have tried to implement them on the site for more than a year. The splog filter seems to have added some new and different "bad-quality signals," and I'd sure like to know how to make sure I'm not giving them off. Any more ideas?

loudspeaker

7:49 pm on Dec 5, 2006 (gmt 0)

10+ Year Member



I don't know the answer to your questions, but let me just say that making RSS a "red flag" is so ridiculous that I don't think even Google would consider such a thing, no matter how zealous they are to stamp out splogs...

P.S. If they do, the next thing on the "red flag list" might just be the use of HTML... After all, spammers use it, right?

anax

11:15 pm on Dec 5, 2006 (gmt 0)

10+ Year Member



Yes, using RSS as a flag does sound silly, but we grasp at whatever we can when we're in the dark and in trouble.

I'd be glad to have Google implement a "trusted site" feature as part of Sitemaps/Webmaster Tools that would let people verify that a site is legitimate and not a splog. I'd enter a captcha code twice a day if they wanted, if it would bring back the traffic I had before October 21st.