
Google SEO News and Discussion Forum

    
Google's Anti-Spam Challenge

engine
10:58 am on Jan 5, 2011 (gmt 0)

Google's Anti-Spam Challenge [guardian.co.uk]
the problem that plagued the first generation of search engines such as Altavista now seems to be gaining traction on Google, which outdistanced those earlier rivals precisely because it dumped the spam so effectively.

Jeff Atwood consulted Matt Cutts, Google's anti-spam king:

"We did a ton of due diligence on webmasters.stackexchange.com to ensure we weren't doing anything overtly stupid, and uber-mensch Matt Cutts went out of his way to investigate the hand-vetted search examples contributed in response to my tweet asking for search terms where the scrapers dominated. Issues were found on both sides, and changes were made. Success!"

Except it isn't really success. It's a temporary respite. As Atwood points out moments later, "Anecdotally, my personal search results have also been noticeably worse lately. As part of Christmas shopping for my wife, I searched for 'iPhone 4 case' in Google. I had to give up completely on the first two pages of search results as utterly useless, and searched Amazon instead."


 

FranticFish
4:44 pm on Jan 5, 2011 (gmt 0)

As long as Google try to tackle this exclusively algorithmically, I think they'll fail.

Why not actually try some form of human intervention?

What I usually see is that spam is facilitated by networks of sites; although I may only discover a network because I analysed one site, that same network is being used to prop up hundreds of other sites. Manually taking down the network's ability to pass PR or anchor text would probably stop hundreds of sites from ranking. Placing warnings (like the malware warnings) on participating sites would also send a clear message.
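To make the idea concrete, here is a minimal sketch of the kind of network detection being described: sites that share an unusually large set of outbound link targets get clustered together. The domains, the link graph, and the threshold below are hypothetical examples for illustration, not anything Google is known to do.

# Illustrative sketch only: cluster sites that share many outbound
# link targets, one naive signal of a coordinated link network.
# The domains, graph, and threshold here are hypothetical examples.
from collections import deque

outlinks = {
    "spam-a.example": {"client1.example", "client2.example", "client3.example"},
    "spam-b.example": {"client1.example", "client2.example", "client3.example"},
    "spam-c.example": {"client2.example", "client3.example"},
    "innocent.example": {"news.example"},
}

def shared_targets(a, b):
    # How many outbound targets two sites have in common.
    return len(outlinks[a] & outlinks[b])

def find_networks(min_shared=2):
    # Connect any two sites sharing >= min_shared targets, then
    # return the connected components with more than one member.
    sites = list(outlinks)
    adj = {s: set() for s in sites}
    for i, a in enumerate(sites):
        for b in sites[i + 1:]:
            if shared_targets(a, b) >= min_shared:
                adj[a].add(b)
                adj[b].add(a)
    seen, clusters = set(), []
    for s in sites:
        if s in seen:
            continue
        component, queue = set(), deque([s])
        while queue:
            cur = queue.popleft()
            if cur in seen:
                continue
            seen.add(cur)
            component.add(cur)
            queue.extend(adj[cur] - seen)
        if len(component) > 1:
            clusters.append(component)
    return clusters

print(find_networks())
# e.g. [{'spam-a.example', 'spam-b.example', 'spam-c.example'}]

Every member of such a cluster could then be reviewed (or devalued) together, which is the "take down the whole network" effect described above.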

Where the network is traceable back to an agency (and some are), then I personally would kill their site(s) too, denounce them publicly on a Google blog, and place warnings in the SERPs.

Spam is a huge problem, and even a fairly large dedicated human team might only be able to act on a small percentage of the reports.

But on the other hand, humans can spot spam instantly - and without the 'false positives' that plague algorithmic detection.

And I think this would be far more of a deterrent. If word got around that spam reports were actually acted on by humans and manual penalties were handed out, might that not deter people more than the current situation?

steerpikegg
6:59 pm on Jan 5, 2011 (gmt 0)

Placing warnings (like the malware warnings) on participating sites would also send a clear message.


I'll have to be brief as I am on the train, but I think that to tar and feather a company or site because it has a network of sites would be completely wrong and against the principles of the web. PR, keywords et al. are something entirely of Google's own making, and it is up to Google to control how they are used. The whole point of the internet is that it is boundless.

If someone wants to set up a network of sites, or create a site that consists of the same word (keyword?) repeated thousands of times, then that is entirely their own prerogative as long as it does not break any statutory laws. If Google choose to index this and rank it, then it really is their lookout. They could quite rightfully remove such sites from the index, but nothing would give them the legal right to label the owner of said sites negatively.

I do not condone spam or those who originate it, but Google are not the 'police' of the internet and have zero right to say what any person can or cannot do with their website / links / content.

The fact that Google can come along and index/cache every page of your site without your explicit consent is in itself morally and legally dubious - indexing of pages by search engines should be strictly opt-in, for better or worse. That alone might cut out a considerable number of poor-quality SERPs caused by Google's mass consumption of pages that were perhaps never intended to be indexed in the first place.
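For context, the mechanism that exists today is the reverse: indexing is opt-out via robots.txt rather than opt-in. A minimal sketch using Python's standard-library parser, with example rules rather than any real site's file:

# Sketch of today's opt-out model: a site can only exclude crawlers
# after the fact via robots.txt. The rules below are an example only.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: Googlebot
Disallow: /private/

User-agent: *
Disallow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("Googlebot", "https://example.com/private/page"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/public/page"))   # True
print(rp.can_fetch("OtherBot", "https://example.com/public/page"))    # False

An opt-in model would invert that default: nothing gets crawled or indexed unless the site affirmatively allows it.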

This is not intended to be inflammatory, but just a statement of the facts. Google are not the internet; they are not the police; they must make the best of the hand that is dealt them when they choose to index the web at large.
