
Link Development Forum

    
SEOmoz Developing a Spam Detection Algorithm
To "help...marketers... decide whether to acquire certain links"
martinibuster
msg:4450933 - 5:16 pm on May 8, 2012 (gmt 0)

SERoundtable breaks the news here [seroundtable.com].

According to Rand Fishkin:

Some of our team members, though, do have concerns about whether SEOs will be angry that we're "exposing" spam. My feeling is that it's better to have the knowledge out there (and that anything we can catch, Google/Bing can surely better catch and discount) then to keep it hidden. I'm also hopeful this can help a lot of marketers who are trying to decide whether to acquire certain links or who have to dig themselves out of a penalty (or reverse what might have caused it).


Barry Schwartz says this:

One thing Rand gets himself in trouble with, or has gotten in trouble with is "outing" SEOs for spamming. He has tarnished his reputation amongst some SEOs in doing so...


In a Facebook SEO discussion group I belong to, a typical comment was that Rand Fishkin has taken outing to a new level.

What do you think?

 

smallcompany
msg:4451485 - 7:30 pm on May 9, 2012 (gmt 0)

How can they distinguish between spammy and not-spammy links?
This can be very misleading and distracting.

Search engines decide whether a link earns positive or negative points, and their algorithms are secret.
No matter how much you know, you're never sure.

It's a good business idea, though, as a lot of people will be willing to pay for it in a desperate attempt to get their lost traffic back (Panda, Penguin, or whatever else).
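
Just to illustrate how much guesswork is involved, here is a toy link scorer in Python. The signals and weights are completely made up for the sake of the example; nobody outside Google or Bing knows what the real points look like, and this is nothing like anything SEOmoz has announced.

# Purely hypothetical "spam points" guesser; signals and weights invented
# for illustration only.

def score_link(anchor_text, source_domain, outbound_links_on_page):
    """Return a rough spam score; higher means more suspicious."""
    score = 0
    # Exact-match money anchors look less natural than brand or URL anchors.
    if anchor_text.lower() in {"cheap widgets", "best payday loans"}:
        score += 2
    # Pages that link out to hundreds of sites are often link farms.
    if outbound_links_on_page > 100:
        score += 2
    # A weak, noisy signal: TLDs that are popular with throwaway domains.
    if source_domain.endswith((".info", ".biz")):
        score += 1
    return score

print(score_link("best payday loans", "example.info", 250))  # prints 5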

Planet13
msg:4452364 - 7:06 pm on May 11, 2012 (gmt 0)

Well, I think SEOmoz should first concentrate on giving more comprehensive link reports. I find the Majestic SEO reports far superior, as Yahoo Site Explorer was.

There was another link analysis tool (I think based in Ukraine or Belarus) that had a pretty good metric for approximating the "value" of a link. I don't think it dealt with web spam per se, but it was a pretty good first step in determining whether a link might be worth chasing.

But really, one's own judgement would be the best "anti-spam" tool, wouldn't it? I mean, it's sort of like jazz: if you gotta ask someone what spam is, you'll never get it, right?

It's a good business idea, though, as a lot of people will be willing to pay for it in a desperate attempt to get their lost traffic back (Panda, Penguin, or whatever else).


Good point.

BaseballGuy
msg:4453108 - 6:43 am on May 14, 2012 (gmt 0)

I'm not sticking up for SEOmoz... but their link tool has helped me uncover a few things that I couldn't with the other tools.

I might give the Majestic guys my money next month and see if their reports are more in-depth.

Robert Charlton
msg:4453134 - 9:13 am on May 14, 2012 (gmt 0)

a typical comment was that Rand Fishkin has taken outing to a new level.

At best, he has taken social linkbaiting to a new low. Let's hope Rand's not serious about outing sites in the ways proposed. Beyond the questionable ethics of it, he would be releasing a tool where false negatives might be as dangerous in the long run as false positives.

I can perhaps see legit value in such scoring... if a site's spam score were available only to the registered owners of the sites being scored. Absent feedback from Google, an outside view of one's own site might for some be of interest. But making simplistic spam scores available publicly, thus exposing sites... particularly sites trying to build themselves out of a hole... would, IMO, be unconscionable.

Equally disturbing to me is the confused logic trying to justify selling the information. On the one hand, it's being pushed as a tool to help webmasters and SEOs build Google-friendly links, but, as I remember Google's guidelines, Google wants to rank sites on the basis of "editorial votes" that are "freely given".

Not being naive about it, I nevertheless feel compelled to ask... how is a tool that gives SEOs a "spam score" for other sites on the web going to help build such freely given links? That's muddying the waters quite a bit. And assuming Google is smarter or more discriminating than SEOmoz or its customers, now or in the future, the tool might be downright dangerous: a link it scores as clean could still be one Google discounts or penalizes. This is the false negative I was talking about.
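
To put the two failure modes side by side, here is a trivial Python illustration; the sites, scores, and threshold are invented for the example and have nothing to do with any real scoring.

# Invented scores from a hypothetical spam-scoring tool.
tool_scores = {
    "small-local-biz.example": 8,     # harmless site that trips the scorer
    "slick-link-network.example": 2,  # real spam that slips under the bar
}
SPAM_THRESHOLD = 5

for site, score in tool_scores.items():
    verdict = "spam" if score >= SPAM_THRESHOLD else "clean"
    print(site, "->", verdict)

# The first result is the false positive: a legitimate site publicly tarred.
# The second is the false negative: an SEO builds the "clean" link and later
# eats whatever penalty Google decides to hand out.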

As a tool for cleaning up sites... I think current tools are adequate, and they don't go around carving scarlet letters.

Regarding the possibility of social linkbaiting... I truly hope all this is only linkbait, perhaps <conjecture> leading up to a fallback position where, say, "after consultation with lawyers about the danger of false positives", SEOmoz decides to limit access to the site information to the owners of those sites, who in turn agree to register and confer some legitimacy to SEOmoz's mythical crawler (see notes below). Arguably, this would be a questionable way to get an opt-in, as those sites not appearing in the SEOmoz index might by implication be tarred with the same brush as spammers, but, moving forward since its venture financing of a couple of weeks ago, SEOmoz might want to be seen as having a well-behaved bot with a named user-agent. </conjecture>
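
For what it's worth, if the crawler did announce a named user-agent, opting out (or in) would come down to an ordinary robots.txt rule. A quick check with Python's standard robotparser; the bot name "examplebot" is made up, since no user-agent has been disclosed:

from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rule for a hypothetical named bot.
robots_txt = "User-agent: examplebot\nDisallow: /\n"

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())
print(rp.can_fetch("examplebot", "https://example.com/any-page"))  # False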

There would also be the question of ownership of such data. Regarding that, and the SEOmoz crawler, see these discussions...

SEOmoz Crawler/Bot
http://www.webmasterworld.com/search_engine_spiders/3759661.htm [webmasterworld.com]

The SEOmoz Linkscape Ghost
by Pierre Far
[ekstreme.com...]

Ecommerce: The Ethics And Value In Scraping, or Data Mining
http://www.webmasterworld.com/ecommerce/3902682-5-30.htm [webmasterworld.com]

Andy Langton
msg:4453142 - 9:23 am on May 14, 2012 (gmt 0)

Some very good points there, Robert. The whole idea is confused and unhelpful IMO.

help...marketers... decide whether to acquire certain links


So, is it to help you decide whether to "acquire" a jolly natural link from someone? Or is it to make sure you don't accidentally-naturally-acquire a bad link?

If the data exactly matched Google's own spam detection it might be a different matter, but as far as I can tell, it will either need to be extremely basic (i.e. identify very obvious hacked/#*$!/drug-related links) or it will be making questionable judgements about websites and encouraging webmasters to act upon that information.
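
The "extremely basic" end of that spectrum might amount to little more than a bad-terms lookup, something like this sketch (the terms are invented for illustration), which of course says nothing about whether a perfectly ordinary-looking link was bought or machine-generated:

OBVIOUS_SPAM_TERMS = {"viagra", "casino bonus", "payday loan"}

def looks_obviously_spammy(anchor_text):
    """Flag only anchors that trip an obvious bad-terms list."""
    text = anchor_text.lower()
    return any(term in text for term in OBVIOUS_SPAM_TERMS)

print(looks_obviously_spammy("buy viagra online"))    # True
print(looks_obviously_spammy("handmade oak tables"))  # False, even if paid for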

Frankly, the people most likely to benefit from a tool that attempts to replicate Google's link spam judgements are link spammers. Run your link database through SEOMoz before you fire up Xrumer.

Robert Charlton
msg:4453144 - 9:30 am on May 14, 2012 (gmt 0)

Run your link database through SEOMoz before you fire up Xrumer.

In that case, maybe webmasters will pay SEOmoz to have their sites tagged as spam. ;)

Andy Langton
msg:4453145 - 9:32 am on May 14, 2012 (gmt 0)

In that case, maybe webmasters will pay SEOmoz to have their sites tagged as spam


Cloak for spam detection! Not a bad idea, actually ;)

Robert Charlton
msg:4453187 - 11:12 am on May 14, 2012 (gmt 0)

Cloak for spam detection!

That's why there's no disclosed user agent, Andy. SEOmoz doesn't want white hat sites to pretend to be worse than they are. ;)
