
Forum Moderators: Robert Charlton & goodroi


Links & Negative SEO. Why?

     
5:33 pm on Oct 30, 2014 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Nov 13, 2005
posts: 716
votes: 0


Something has been bothering me for a long time and I just can't work it out: why is negative SEO allowed to work?

Leaving aside conspiracy theories and the like, I can't understand why Google would not simply discount a bad link. Why would it penalise one? I see the argument that they want to punish spammers/black hatters, but that's not logical: it's not rocket science to realise that this would then allow those same unethical folks to hurt a competitor and potentially remove a genuinely good & useful resource as an obstacle.

I don't have an axe to grind here at all - I've never been involved with a site that has been hurt by this (yet!) - but I do constantly worry that one could get targeted. As a consequence, I and others spend time that could be better spent elsewhere building defences, reading forums like this one and watching for a negative SEO attack. Which also slows down what Google ultimately wants us to provide: a higher-quality resource.

Love them or hate them, Google - collectively - is very, very intelligent, and I simply can't work out why they wouldn't just nullify a bad link. Surely everyone would benefit from that?
6:06 pm on Oct 30, 2014 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member netmeg is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Mar 30, 2005
posts:13012
votes: 222


Because good, bad or indifferent, they want to discourage the behavior. They're not particularly concerned about things like collateral damage or negative SEO.
6:10 pm on Oct 30, 2014 (gmt 0)

Preferred Member

5+ Year Member Top Contributors Of The Month

joined:May 24, 2012
posts:648
votes: 2


Why would it penalise one?


My best guess is that the number of manipulative links that G couldn't reliably detect started to worry them.

So, in addition to finding a technical solution, they wanted to add a non-technical solution to battle the problem.

Instilling fear, uncertainty, and doubt in the webmaster community was that tool. It's certainly a powerful motivator. Imagine, for example, how little the disavow tool would have been used without the notion of "harmful links".
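For anyone who hasn't used it: the disavow tool accepts a plain-text file uploaded through Google's webmaster console. A minimal sketch of the format (the domains and URL below are hypothetical placeholders):

```
# Lines starting with # are comments.
# Disavow every link from an entire domain:
domain:spammy-directory.example

# Disavow a single linking URL:
http://link-farm.example/widgets/page1.html
```

One entry per line; the `domain:` prefix covers all pages on that host, while a bare URL disavows only that specific page's links.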
7:02 pm on Oct 30, 2014 (gmt 0)

Full Member

5+ Year Member Top Contributors Of The Month

joined:Feb 22, 2013
posts:265
votes: 0


So that we could spend lots of our time removing links instead of adding new/fresh content.
7:17 pm on Oct 30, 2014 (gmt 0)

Junior Member from IT 

5+ Year Member

joined:Oct 29, 2013
posts: 143
votes: 0


netmeg and rish3, in all honesty, I still fail to see how penalties are a way to "discourage" the practice when it comes to negative SEO. All I can see from webmasters' reports is that the contrary is actually true.

Personally, negative SEO attackers would cause me no trouble because my presence in Google is not relevant, but for people who honestly try to abide by Google's strict rules, it all backfires.

Looks like there's a loop somewhere; gotta find a way to escape it. Somehow. :-/
8:09 pm on Oct 30, 2014 (gmt 0)

Administrator from US 

WebmasterWorld Administrator goodroi is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:June 21, 2004
posts:3508
votes: 386


Be careful about blindly believing the hype about negative links. There is a large amount of misinformation (some intentional and some unintentional). Either way, it does help Google discourage certain styles of behavior that it does not enjoy.
8:16 pm on Oct 30, 2014 (gmt 0)

Preferred Member

5+ Year Member Top Contributors Of The Month

joined:May 24, 2012
posts:648
votes: 2


netmeg and rish3, in all honesty, I still fail to see how penalties are a way to "discourage" the practice


I'm not arguing that it was a good idea. Just offering why I think they did it.
8:17 pm on Oct 30, 2014 (gmt 0)

Senior Member

WebmasterWorld Senior Member editorialguy is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:June 28, 2013
posts:3476
votes: 781


Does "negative SEO" work equally well against all sites?

Or does it work most reliably when used against sites that lack high-quality inbound links to counterbalance the spammy links?

I'm guessing the second is most likely, and I've read any number of articles by SEO experts who have shared that view. Common sense would suggest that a site owner with few genuine citations and a history of penalties may be more vulnerable than someone with a clean record.

The notion that Google should simply disregard spammy links ignores the fact that crawling, indexing, calculating the value of links, etc. costs money. It's in Google's interest to encourage site owners to clean up messes that they or their SEOs have made--both to reduce server overhead and to teach site owners and SEOs that being involved in unnatural linking schemes may be more trouble than it's worth.
8:24 pm on Oct 30, 2014 (gmt 0)

Preferred Member

5+ Year Member Top Contributors Of The Month

joined:May 24, 2012
posts:648
votes: 2


Does "negative SEO" work equally well against all sites?


I'm not sure anyone really knows.

BBC News received a 'notice of detected unnatural links' over links, which they didn't create, pointing to a single page.

It's certainly easier for notable sites to make a fuss and get false positives manually revoked.
8:27 pm on Oct 30, 2014 (gmt 0)

Senior Member

WebmasterWorld Senior Member aristotle is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Aug 4, 2008
posts:3660
votes: 373


This was an act of desperation on Google's part.

Matt Cutts realized that all of his efforts over the years to fight spam had mostly failed, and Google finally decided that it couldn't be done without sacrificing a lot of good websites at the same time.

Unfortunately, it hasn't worked as Google had hoped, since the search results are just as bad as before, if not worse, and so many spammers and black-hatters are still raking in big profits. In the meantime a fundamental flaw has been introduced into the algorithm.

Out of all of Google's mistakes and failures, this is one of the worst.
9:19 pm on Oct 30, 2014 (gmt 0)

Administrator from US 

WebmasterWorld Administrator goodroi is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:June 21, 2004
posts:3508
votes: 386


Unfortunately, it hasn't worked as Google had hoped, since the search results are just as bad as before, if not worse

We are each entitled to our opinions, but I would not call Google search results bad. The SERPs have never been better, because MY sites are getting traffic. Once my sites stop gaining traffic, then the SERPs will be terrible and the worst ever. I am joking, but I want to raise the point that we need to be careful not to let our personal opinions influence our professional assessments :).

Professionally, I think Google SERPs are doing just fine, and I base that on Google's growing market share. They are bordering on a monopoly in certain areas. If people thought Google's search results were bad, then Yahoo wouldn't have been forced to close down their search team and Bing would have a bigger market share. Bing actually has a rewards program that basically pays people to search on Bing, and they still have a minority market share. You can love or hate Google, but it appears the public has voted and likes Google's search quality, based on their overwhelming use.

Link penalties (real or fake or overhyped) have helped Google stop a good chunk of lower-quality websites. They have scared a good number of webmasters into abandoning quick shortcuts and either developing stronger websites with better quality signals or leaving the internet marketing industry. Nothing is perfect, and there has been collateral damage. When you look at Google's search market share, it is hard to say that their handling of link penalties is a mistake or likely to be rolled back.
9:30 pm on Oct 30, 2014 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member netmeg is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Mar 30, 2005
posts:13012
votes: 222


I'm not a huge believer in mass quantities of negative SEO. I'm sure it happens, but I'm equally sure it doesn't happen nearly as much as is reported. And I'm almost as sure that Google knows it when they see it.
9:52 pm on Oct 30, 2014 (gmt 0)

Preferred Member

5+ Year Member Top Contributors Of The Month

joined:May 24, 2012
posts:648
votes: 2


And I'm almost as sure that Google knows it when they see it.


How would they tell the difference between crappy links created by the site owner, and crappy links created by a competitor?

(assuming the competitor has the discipline to create the links the same way a self-spamming site owner would)
10:47 pm on Oct 30, 2014 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Nov 13, 2005
posts: 716
votes: 0


The notion that Google should simply disregard spammy links ignores the fact that crawling, indexing, calculating the value of links, etc. costs money.


But conversely, it costs them money because they then have to create a disavow tool and support the requests it generates. Plus negative SEO makes the SERPs worse (I am with goodroi on that one - I still think they are the best out there for the moment).

I just find it totally perplexing and very surprising that Google would introduce negative SEO opportunities in the first place and, perhaps even more worryingly, not put a stop to them when there is clearly a problem. And there must be a significant problem, or they wouldn't have created a disavow tool.

If they just stopped penalising bad links and simply ignored them, a lot of people would benefit - including Google - and a lot of people could spend their time far more productively.

...How would they tell the difference between crappy links created by the site owner, and crappy links created by a competitor?


...and that wouldn't even matter anymore.

[edited by: Simsi at 10:54 pm (utc) on Oct 30, 2014]

10:53 pm on Oct 30, 2014 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member netmeg is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Mar 30, 2005
posts:13012
votes: 222


How would they tell the difference between crappy links created by the site owner, and crappy links created by a competitor?


History, for one. Also, if someone was looking to take out a competitor and it was so easy and so cheap, how many would limit themselves to one? Google has gotten better and better at detecting patterns.

perhaps even more worryingly, not put a stop to them when there is clearly a problem. And there must be a significant problem or they wouldn't have created a disavow tool.


1. How do we know they aren't putting a stop to them, or at least ignoring them?

2. Why do you think negative SEO is the reason they created the disavow tool?
10:55 pm on Oct 30, 2014 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:June 6, 2006
posts:1191
votes: 41


It's quite simple really.

If Google didn't penalise certain links we'd all get as many as we could pointed at our sites. They may help, they may not. However, heads we win, tails we don't lose.

Links would completely lose their value to Google as a means of sorting the wheat from the chaff.
QED.
10:57 pm on Oct 30, 2014 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Nov 13, 2005
posts: 716
votes: 0


1. How do we know they aren't putting a stop to them, or at least ignoring them?


So to clarify netmeg, you're saying you don't think bad links can hurt you?

2. Why do you think negative SEO is the reason they created the disavow tool?


I think: 2 birds, 1 stone.

Links would completely lose their value to Google as a means of sorting the wheat from the chaff.


Which, in my opinion, is how it should be. It was a flawed, if necessary, concept from the get-go.
12:17 am on Oct 31, 2014 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member netmeg is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Mar 30, 2005
posts:13012
votes: 222


So to clarify netmeg, you're saying you don't think bad links can hurt you?


Some bad links, sure. There's a difference between useless links and bad links. And there's a difference between good patterns and bad patterns.

I think negative seo gets blamed for a lot of other issues, and when no recovery follows disavowal, it just compounds the smokescreen. That's my opinion.
4:05 am on Oct 31, 2014 (gmt 0)

Preferred Member

5+ Year Member Top Contributors Of The Month

joined:May 24, 2012
posts:648
votes: 2


I think negative seo gets blamed for a lot of other issues.

I agree with this. On the other hand, negative SEO isn't just throwing a gazillion cheap links at a site.

You can, for example, buy a small number of the right links, wait 3 months, and manually report it.

Or, you can create massive duplicate content issues, often on the victim's own site, depending on their technology and diligence.

Or, you can just act like a self-spammer, and play the long game. Build links like someone trying to game the system does. Ramp up slowly, over a period of months. Leave a few "accidental" incriminating footprints in the right places.

It's not happening as broadly as some claim, but it is happening. In some specific niches, it's even common.
6:49 am on Oct 31, 2014 (gmt 0)

New User

5+ Year Member

joined:Jan 20, 2011
posts:22
votes: 0


There was a discussion about this some time ago here. So I tried it out, and it works; here's how I did it.

1. With an old automated links program I placed about 300 links within a month or so. It's like the old co-op program from DP. Started 2014-06-16. Time spent: 1 min (as I already had links available).
2. All links used the same anchor: a money keyword the site had used once before as an anchor.
3. The site had 13 unique links in total before the new links came.

Result: The first month was really positive for the site. Then Penguin 3 came, and the site lost most of its long-tail rankings. It went from position 3 to 20 on its main keyword, and I really can't find it for any other keywords any longer. For the keyword I used in the anchors (a really good keyword), the site is nowhere to be seen, despite having held a second-page position before.

There is some other stuff that could have affected this site, such as its previous links being bad (which I doubt) or the content being too weak. On the other hand, Penguin is a link penalty, so this is related to links.

Anyway, I think negative SEO works, especially on sites with under 20 unique backlinks.
1:27 pm on Oct 31, 2014 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Feb 3, 2014
posts:1483
votes: 598


Why would it penalise one?


Because several years ago, Google switched from being a search engine to a penalty engine.

Compare results from Yahoo and Bing, then check Google, in most cases you'll find your page one listings in Bing and Yahoo on page 7 of Google.

The page 4 to 9 area, from what I've found in my niche, is a penalty box for otherwise good results.
2:16 pm on Oct 31, 2014 (gmt 0)

Senior Member

WebmasterWorld Senior Member editorialguy is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:June 28, 2013
posts:3476
votes: 781


Because several years ago, Google switched from being a search engine to a penalty engine.


To a man who's been hit by a hammer, everything looks like a nail.

Like it or not, Google puts value on links and doesn't have a lot of patience with people who try to manipulate its algorithm. If you can't deal with that reality, robots.txt is your friend.
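Taken literally, the robots.txt suggestion would look like the sketch below; worth noting that this opts you out of Google's crawl entirely, not just out of link penalties:

```
# Ask Google's crawler to stay away from the whole site
User-agent: Googlebot
Disallow: /
```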
2:27 pm on Oct 31, 2014 (gmt 0)

Administrator from US 

WebmasterWorld Administrator goodroi is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:June 21, 2004
posts:3508
votes: 386


Google is a search engine that uses positive and negative ranking points. If you ran your own search engine you would likely do the same thing. You would want to use as many different tools as possible to provide search results to satisfy your users.

Webmasters are not Google's primary audience or concern. Google's primary concern is the searching consumer that Google can use to generate advertising revenue.

Yahoo & Bing also use negative link signals; it's just complained about less because webmasters tend to focus on Google's algo. Why wouldn't you focus on Google? In several countries Google has over 90% market share, and in the US Google has more than double the market share of Yahoo & Bing combined.

Instead of asking why, you might want to think about why a room full of super-smart people wouldn't use every possible metric when developing their search results.
3:00 pm on Oct 31, 2014 (gmt 0)

Senior Member

WebmasterWorld Senior Member editorialguy is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:June 28, 2013
posts:3476
votes: 781


Instead of asking why, you might want to think about why wouldn't a room filled with super smart people use every possible metric when developing their search results?


According to Google, its search algorithm has more than 200 ranking factors. (And that's probably not counting Panda or Penguin.)

Site owners and SEOs who focus exclusively on links need to take off their blinders and look at the bigger picture.
3:59 pm on Oct 31, 2014 (gmt 0)

Junior Member

5+ Year Member

joined:Aug 5, 2011
posts:49
votes: 0



Link penalties (real or fake or overhyped) have helped Google stop a good chunk of lower quality websites. It has scared a good bit of webmasters into stop chasing after quick shortcuts and into developing stronger websites with better quality signals or to leave the internet marketing industry. Nothing is perfect and there has been collateral damage. When you look at Google's search market share it is hard to say that their handling of link penalties is a mistake or likely to roll back.

Absolutely agree here. Looking back, we changed our practice 3 years ago and stopped chasing links. Instead, we do marketing - real marketing, with real people and good content. We are building a stronger brand and a better business. Matt Cutts and others have been saying the same thing for years, but we never listened. It is only when you start doing it that you realise they were right all along.

I know some people don't like to hear this but please don't pour cold water on me. Don't worry, those who wouldn't listen to Matt Cutts have no reason to listen to me anyway.
4:11 pm on Oct 31, 2014 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Nov 13, 2005
posts: 716
votes: 0


Link penalties (real or fake or overhyped) have helped Google stop a good chunk of lower quality websites


I am sure that is right. But it has also meant Google has lost a number of high-quality websites to negative SEO attacks. Prevalent or not, the logic whereby Google allows a site to be wiped out by a competitor is what I am questioning here.

we have changed our practice 3 years ago and stopped chasing links.


My question here would be: have you done that primarily because you don't want to risk building "bad" links? Or as much because link value nowadays means time is better spent on other things?

Google is a search engine that uses positive and negative ranking points. If you ran your own search engine you would likely do the same thing. You would want to use as many different tools as possible to provide search results to satisfy your users.


That statement is slightly ambiguous, but if you are suggesting the "same thing" is allowing links to hurt a site, then I absolutely would not. I would not want to risk having valuable resources that enhance my offering taken out by unscrupulous individuals.
4:43 pm on Oct 31, 2014 (gmt 0)

Junior Member

5+ Year Member

joined:Aug 5, 2011
posts:49
votes: 0


My question here would be: have you done that primarily because you don't want to risk building "bad" links? Or as much because link value nowadays means time is better spent on other things?

That's actually a good question, Simsi. I think 90% of the reason is the risk of bad links. We had basically followed the practices most SEOs did over the last 7 years. We didn't want to change, as that's what we knew, and change is scary.

I guess what I am trying to say is that all this bad-links business, Penguin and negative SEO forced us to focus less on SEO and more on business. I know negative SEO can still happen to us, but hopefully it's less likely if we don't appear to be chasing links ourselves. I could be proven wrong in 6 months, but this does feel better.
9:39 pm on Oct 31, 2014 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Sept 14, 2011
posts:1045
votes: 132


I think Google is moving away from backlinks as a ranking signal (or downgrading them). Disavow negative links and forget about them. Before the disavow tool there was no way to communicate bad links. Now there is; it just doesn't work, but at least you'll feel like you're doing something.
5:48 am on Nov 1, 2014 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Feb 3, 2014
posts:1483
votes: 598


Like it or not, Google puts value on links and doesn't have a lot of patience with people who try to manipulate its algorithm.


Thanks for the great laugh with this statement. The algorithm is being gamed all over the place and Google doesn't pick it up. Need examples? I've got hundreds.
6:16 am on Nov 1, 2014 (gmt 0)

Junior Member

5+ Year Member

joined:May 16, 2014
posts:141
votes: 0


Like it or not, Google puts value on links and doesn't have a lot of patience with people who try to manipulate its algorithm.


I'm not one to file spam reports, but it's amazing how some people preach one behaviour yet have a ton of doorway pages and manipulated links on, or leading to, their main site.

There is much to be learned from examples like that, not least that what Google publicly announces often isn't followed in practice.