
Forum Moderators: Robert Charlton & goodroi


Is Reporting Webspam to Google as Ineffective as a Sugar Pill?

1:25 am on Feb 22, 2013 (gmt 0)

Preferred Member

joined:Oct 15, 2011
votes: 0

I've read some interesting discussions involving a large network of directories (in the hundreds) that are hacking sites to place hidden links on them for pagerank. One particular source seems to be the leader of the pack as their link was found hidden inside a very popular American Legion website. The network admin of the American Legion site even confirmed that they were hacked. Pakistani government websites have been hacked for these links along with many other high profile targets.

From what I can tell, dozens of people have reported these directories to Google one by one. That's a lot of work, considering there are hundreds of directories using the same techniques to balloon their PageRank. How long does Google normally take to respond to these webspam reports? Is reporting sites like these like taking a sugar pill, as others have noted?

[edited by: tedster at 3:26 am (utc) on Feb 22, 2013]
[edit reason] No specific accusations, outings - please [/edit]

3:55 am on Feb 22, 2013 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
votes: 0

Spam reports can seem ineffective if you're expecting a change in the situation over the short term. That's a rare occurrence - I've only seen it happen when it's something really out there, like adult material showing up for search terms that young kids often use.

Google prefers to study the reports, looking for the spam patterns and trying to fix their algorithms to catch it in an automated fashion.

11:34 am on Feb 22, 2013 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:May 7, 2003
posts: 750
votes: 0

I have personally reported dozens of sites. In my experience, Google doesn't act on the reports.

At one point one of my competitors was buying links from a really shady blog network. Every few days for months, the links he was buying would pop up in my news reader, where I had a Google Alert set up to watch my competitor. I dutifully reported every single shady site he got a link on. Nothing happened to my competitor's rankings. After months, the links finally stopped showing up in the alert (although I think my competitor is still buying shady links). Google must have updated an algorithm to recognize the sites as spammy enough to at least keep them out of alerts. Hopefully the links are hurting my competitor now (or at least not helping).

11:37 am on Feb 22, 2013 (gmt 0)

Full Member

10+ Year Member

joined:July 21, 2005
votes: 0

"Is Reporting Webspam to Google as Ineffective as a Sugar Pill?"

No, it's not. Sugar pills are more effective, they have a placebo effect.

2:56 pm on Feb 22, 2013 (gmt 0)

Preferred Member

10+ Year Member

joined:June 24, 2005
posts: 446
votes: 0

In my experience they don't work. I submitted one for a blatant link buyer, with a terrific summary of the evidence... and nothing. They're still #1. Perhaps this worked at one point, and Google got swamped with requests, so now they just ignore them or only focus on hackers? Who knows...

3:09 pm on Feb 22, 2013 (gmt 0)

Junior Member

5+ Year Member

joined:Aug 14, 2012
votes: 0

I've had GREAT success reporting, and Google removing, webspam. Particularly .info affiliate sites...

3:33 pm on Feb 22, 2013 (gmt 0)

New User

5+ Year Member

joined:Feb 22, 2013
votes: 0

About as effective as going skydiving and pulling the chute release to find it was packed with confetti. Reporting sites to Google is a big fail. I doubt Google even reads them. I quit sending them a long time ago because they never took action on the ones I sent.

5:28 pm on Feb 22, 2013 (gmt 0)

Preferred Member

5+ Year Member

joined:Feb 18, 2013
votes: 0

These are the things I always seem to get lost on, because reporting seems so ineffective today. But then I remember someone from there saying that they don't like to take immediate manual action, since there are too many results to manage manually. Instead, they try to write algos to detect the spam based on the reports, once there are enough similarities in the reports and the type of spam reported to find a pattern.

I guess it sort of makes sense that way, since if they write an algo they get not only today's reported spam out - they also catch the new spam that fits the same pattern tomorrow, and even next year.