Featured Home Page Discussion
This 42 message thread spans 2 pages.
|Google Takes Action On European, German Link Networks|
| 2:15 pm on Aug 18, 2014 (gmt 0)|
It appears that link networks in Europe are getting hit, according to the latest information from Google's search quality team.
|We have taken action on one European and one German linking network. |
| 7:40 am on Aug 21, 2014 (gmt 0)|
|Spam-fighting doesn't have to be "scalable to the entire Web." |
Of course it does. That is Google's whole approach. No individual action on spam reports; instead, fold what you learn from them into your algorithm. It's been pointed out many times that manual doesn't scale.
Notifying people that you've taken manual action is more about discouraging those thinking of starting or using such networks than it is about stopping those already engaged. Those already engaged will have first-hand experience of whether networks work or not.
| 8:42 am on Aug 21, 2014 (gmt 0)|
Google's initial approach was quite good and probably could be modified to deal with bad links. Google's call centre approach to resolving issues (attempting to fix problems as they emerged rather than improving the algorithm) is what has damaged the SERPs and made them more vulnerable to exploitation. With that approach, dealing with spam becomes a global game of whack-a-mole.
|We have known for a very long time that Google is not good at determining good links from bad links. I am not sure what the surprise is. |
| 2:44 pm on Aug 21, 2014 (gmt 0)|
|Spam-fighting doesn't have to be "scalable to the entire Web." |
Of course it does. That is Google's whole approach.
If it were "Google's whole approach," Google wouldn't have manual penalties, and a manual penalty wouldn't need to be approved by someone higher up the food chain before being implemented.
There's an argument to be made for completely automated penalties, and Penguin and Panda may be steps in that direction. But for now, policy (not just technology) dictates the use of manual penalties.
| 3:57 pm on Aug 21, 2014 (gmt 0)|
Bad choice of words above by me: swap 'approach' for 'ethos', which is more accurate. The point is that this is a company that loves to, indeed has to, automate everything. Data gathered by human assessment is turned into a set of heuristics to be used in lieu of human assessment on a daily basis.
For such a company to take manual action and then publicise it is rather strange to my mind, unless the act is intended to have meaning for others. Is this what you mean by 'policy dictates'?
| 5:57 pm on Aug 21, 2014 (gmt 0)|
|Google's initial approach was quite good and probably could be modified to deal with bad links. |
My experience was that, back in the day, you could point an anchor text link from almost anywhere and rank. Maybe we are thinking about how well Google handled links from different perspectives.
I actually don't think that era of links was any worse than today. Simple economics should come into play at some point: a site may be able to rank for a query, but can it satisfy the user?
| 6:02 pm on Aug 21, 2014 (gmt 0)|
|But for now, policy (not just technology) dictates the use of manual penalties. |
The policy is FUD. They can't stop spam (not manually, not technologically), but they can stop more legitimate companies from engaging in it. I think of it like the old TV show "Cops", where they never show a suspect getting away.
| 6:09 pm on Aug 21, 2014 (gmt 0)|
|For such a company to take manual action and then publicise it is rather strange to my mind, unless the act is intended to have meaning for others. Is this what you mean by 'policy dictates'? |
I simply mean that Google has a policy of using human review before applying penalties.
Google obviously does use algorithms to detect spam (including link networks, the topic of this thread), but it uses human judgment (and human review of that human judgment) in assessing spam penalties.
To use an analogy, it's like the way a department store handles possible or suspected shoplifters. Let's say a woman leaves the store with half a dozen dresses, and one of those dresses has an anti-shoplifting tag that sets off an alarm. If the anti-shoplifting process were entirely automated, a cage might drop down and the alarm system would dial the police. In the real world, however, the store uses human judgment to assess whether the shopper might have made a mistake, the cashier might have forgotten to remove a tag, etc. The store detective might decide, "She paid for five of the six dresses, which means she's a customer and not just a thief, so this time we'll give her the benefit of the doubt."
For a search engine, is a policy of using human judgment in applying penalties a sign of weakness or responsibility? That's in the eye of the beholder. Even if you dismiss the idea that Google could behave responsibly, you might be able to see an advantage to Google in reviewing suspected spam sites or networks manually: After all, someone has to create parameters or guidelines for algorithms, and algorithms aren't going to figure out nuances on their own.
| 7:56 pm on Aug 21, 2014 (gmt 0)|
Do we know how Google found this particular link network? Are you so sure that they found it on their own? Or did they perhaps act on a tip-off? They don't respond to individual spam reports, but a network might be different.
If they were capable of finding these things on their own, you'd think they would also have nullifying actions pre-built into the algorithm.
The more I see link networks in action (and I still do regularly when I'm crunching data for a new niche) the more I'm convinced that Google is not that smart.
The tools I use are freely available and I don't pay much (around £150) monthly to access a few different data sets that I mash together in my own tool for analysis.
It's apparent that you're on a bogus site within seconds. In fact, I can usually tell by the URL or the domain before I even click through to the page to assess it. My own tool allows me to set simple character strings to sift for in the domain or URL to trap likely spam sites, and I'm no PhD and don't have billions in the bank.
I have wondered for a while whether Google's quest to have the biggest index opens them up to this sort of thing. By indexing as many pages as possible, and therefore as many links as possible, they may be exposing themselves to mass abuse. I see hundreds, even thousands, of websites that have no basis for existing whatsoever other than to fuel rankings. And because they are 'stand out' obvious to humans, the only links they get come from hacked pages or from websites with very low acceptance standards (SEO directories, bookmarks/pins, fooled link exchange scripts, etc.). What I mean is that almost all the links pointing into the network are rotten. It seems that a decent search engine would be able to catch this.
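The simple character-string sift described above could be sketched in a few lines of Python. The substrings and URLs below are invented for illustration, not the poster's actual filter list:

```python
# A minimal sketch of sifting URLs for suspicious character strings.
# The substring list is a hypothetical example, not a real spam signature set.
SUSPECT_SUBSTRINGS = ["cheap-", "-links", "best-seo", "article-dir"]

def looks_spammy(url: str) -> bool:
    """Flag a URL for manual review if its domain or path contains
    any of the suspect character strings (case-insensitive)."""
    lowered = url.lower()
    return any(s in lowered for s in SUSPECT_SUBSTRINGS)

# Example run over a couple of made-up URLs:
urls = [
    "https://cheap-widget-links.example/post1",
    "https://www.legitnews.example/story",
]
flagged = [u for u in urls if looks_spammy(u)]
print(flagged)  # only the first URL is flagged
```

A real tool would presumably parse out the domain properly and combine this with the mashed-up link data sets mentioned above, but the core trap really can be this simple.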
| 8:07 pm on Aug 21, 2014 (gmt 0)|
|I see hundreds, thousands of websites that have no basis for existing whatsoever other than to fuel rankings. |
I am very familiar with this. The unseen web that props up the web that civilians all know and love.
| 10:02 pm on Aug 21, 2014 (gmt 0)|
|If they were capable of finding these things on their own, then you'd also think they would also have nullifying actions pre-built into the algorithm. |
They probably do, but that doesn't mean there's no value in deterring bad behavior through announcements and manual penalties.
| 2:06 am on Aug 22, 2014 (gmt 0)|
|It's apparent that you're on a bogus site within seconds... And because they are 'stand out' obvious to humans the only links they get are hacked, or from websites with very low acceptance standards (hacked links, SEO directories, bookmarks/pins, fooling link exchange scripts etc)... It seems that a decent search engine would be able to catch this. |
Yes, a lot of sites are being propped up by this sort of simplistic blog network. Penguin does seem to help with this, but they haven't run it in quite some time.
The serious manipulators, though, are making site networks that are truly private...only for their own sites, and perhaps a select few of their peers. And they are buying legit, existing sites to build the network.
And the really serious manipulators are buying links on "Top 500" type sites you wouldn't think would risk selling links. (Hint: writers' salaries aren't that great, and many of them are adept at earning supplemental income.)
|that doesn't mean there's no value in deterring bad behavior through announcements and manual penalties. |
Spot on. The real value is creating fear. Collateral damage isn't an issue when fear is the goal.
| 3:20 am on Aug 22, 2014 (gmt 0)|
|Yes, a lot of sites are being propped up by this sort of simplistic type of blog network. |
Google tends to go after the low-hanging fruit when it comes to dishing out link penalties. As you noted, the real manipulators getting away with it have their own networks, which aren't subject to the weaknesses that come with selling links on an open market. Since they own these truly private network sites, it's very easy to remove the links and gain quick approval of a reconsideration request when they get caught. All the big brands using link schemes fall into this category.
With Google's public back-patting when it penalizes networks, it will likely gain some new AdWords users from its war on links. Others will be driven further underground in an effort to remain undetected. Though the short-term benefits Google receives may seem worth it, the time website owners waste trying to please Google by disavowing bad links, requesting link removals, etc., has adversely impacted Google's reputation.
On one website with an Alexa rank under 10,000, over 7% of traffic originated from link referrals just a few years ago. The same website now gets less than 2% of its traffic from link referrals, and other semi-popular websites I can view traffic stats for show similar downward trends. With Google controlling so much of the search market, it stands to benefit most when website owners fear to link out or be linked to. The fewer links there are, the more dependent people become on a search product to find what they're looking for. As I've said in the past, penalizing a competing source of website traffic does raise important legal issues.
After all this time, and the number of networks penalized, I would think even the most thick-headed black hat would stay away from those links. It would seem that publicly advertised blog network links would be more prone to draw customers seeking to harm other websites. Though I understand why many genuine negative SEO victims don't want to come out and say they are penalized, I really wonder how many people use these blog networks for harmful purposes. For a little bit of money, and possibly a few months of improved rank, a competing website can sit in purgatory for years once caught with blog network links. For some, I'm sure that's an acceptable tradeoff to push a competitor aside for a long period of time.