Forum Moderators: Robert Charlton & goodroi
We have taken action on one European and one German linking network.[twitter.com...]
If they need to take specific action, this means that this is something that their algorithm cannot deal with.
Strange how the true believers never mention such things. Perhaps Google's earlier algorithm could have detected and neutralised such activity more easily, but the Panda and Penguin muppetry has banjaxed Google's algorithm to such an extent that it is now more a patchwork of questionable fixes than an algorithm. These actions confirm something very unpalatable for Google and its fans: Google's algorithm is exploitable.
If they need to take specific action, this means that this is something that their algorithm cannot deal with.
In fact it's because of these shortcomings and defects that Google still needs to impose manual penalties, or even needs to have a spam team at all.
How's Google going to find that? They can't, not even by hand.
Context. Link acquisition can be graphed. Even Google could find it eventually.
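The point above is that a site's link-acquisition history is a time series, and paid or networked links tend to arrive in bursts rather than gradually. A minimal sketch of that idea, assuming hypothetical monthly counts of new inbound links (the function name and threshold are illustrative, not anything Google has published):

```python
# Hypothetical sketch: flag unnatural link-acquisition spikes.
# Input is a list of new inbound links counted per month.

def flag_link_spikes(monthly_new_links, multiplier=3.0):
    """Return indices of months whose new-link count exceeds
    `multiplier` times the average of all preceding months."""
    flagged = []
    for i in range(1, len(monthly_new_links)):
        baseline = sum(monthly_new_links[:i]) / i
        if baseline > 0 and monthly_new_links[i] > multiplier * baseline:
            flagged.append(i)
    return flagged

# A natural profile grows gradually; a bought network shows a burst.
organic = [10, 12, 11, 14, 13, 15]
network = [10, 12, 11, 95, 90, 14]
print(flag_link_spikes(organic))  # []
print(flag_link_spikes(network))  # [3]
```

A real system would obviously use far richer signals (anchor-text distribution, co-citation between linking sites, and so on), but the shape of the curve alone is enough to make a network stand out.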
Or maybe it's simply more efficient and cost-effective to penalize certain types of offenses, networks, etc.
The only reason it might currently be more efficient and cost-effective, even if you make that assumption, is because of the shortcomings and defects in the current algorithm.
The phrase "shortcomings and defects" makes sense only if Google is trying to fight all spam algorithmically and is failing in the attempt.
Fighting spam manually is not "scalable" to the entire web in a cost-beneficial way compared with algorithmic "spam fighting," especially over the long run.
Spam-fighting doesn't have to be "scalable to the entire Web."
Maybe I'm just dumb?
Spam-fighting doesn't have to be "scalable to the entire Web." (For one thing, spam doesn't occur on the entire Web.)
It does. Even the slow learners in Google realise this. A general solution to a particular spam problem is far more efficient and easier to deploy on an index (or multiple indices) than a hand-tweaked solution affecting a few websites. As for spam not occurring on the entire Web, what absolute rubbish! Spam happens because Google's algorithm and those of other search engines are exploitable. There will always be people trying to gain some competitive advantage, whether by whitehat or blackhat means. People running search engines (Google, Bing, Yandex et al) and building search engine indexes (neither of which seems to apply to you) are always looking for elegant and effective solutions to spam problems.
In other words, most of the MFA crowd has very little skin in the game when compared to a normal small business.
Webmaster World seems to attract more of the latter than the former, which may be why these threads often end up being echo chambers where (for example) the participants complain that Google is trying to make them buy AdWords.
If they need to take specific action, this means that this is something that their algorithm cannot deal with.
[edited by: martinibuster at 6:06 pm (utc) on Aug 20, 2014]
51% good results is good enough to keep the shareholders happy
I've tried AdWords, and I'm no pro, but in my experience it's a bad direction for many small business owners to go.