


Reverse Link Blocking - Bing Style

     
4:11 pm on June 29, 2012 (gmt 0)

Administrator from US 

WebmasterWorld Administrator brett_tabke is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Sept 21, 1999
posts:38048
votes: 11


Today we’re announcing the Disavow Links feature in Bing Webmaster Tools. Use the Disavow Links tool to submit page, directory, or domain URLs that may contain links to your site that seem "unnatural" or appear to be from spam or low quality sites. This new feature can be easily found in the Configure Your Site section of the navigation.


Using the Disavow Links tool, you can easily and quickly alert Bing about links you don’t trust. Using the drop-down, you can choose to input signals to us at the page level, a directory level or at a domain level. We’ll note the “type” of location (page, directory or domain) and the date you told us of the action. [bing.com...]
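
Since the tool is only exposed through the web form in Bing Webmaster Tools (no API is mentioned in the announcement), here is a rough, purely illustrative Python sketch of how a webmaster might keep a local log of what they have submitted, mirroring the page/directory/domain levels described above. The file name and helper functions are made up for the example:

# Hypothetical helper for tracking what you've submitted through the
# Bing Disavow Links form. Bing only offers a web UI here, so this just
# keeps a local record of each entry, its level, and the date submitted.
import csv
from datetime import date
from urllib.parse import urlparse

def classify(url):
    """Guess whether an entry is domain-, directory-, or page-level."""
    parsed = urlparse(url if "://" in url else "http://" + url)
    path = parsed.path
    if path in ("", "/"):
        return "domain"
    if path.endswith("/"):
        return "directory"
    return "page"

def log_disavow(url, log_file="disavow_log.csv"):
    """Append one disavowed URL, its level, and today's date to a CSV log."""
    with open(log_file, "a", newline="") as f:
        csv.writer(f).writerow([url, classify(url), date.today().isoformat()])

if __name__ == "__main__":
    log_disavow("http://spam-example.com/")              # domain level
    log_disavow("http://spam-example.com/links/")        # directory level
    log_disavow("http://spam-example.com/links/1.html")  # page level
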
4:44 pm on June 29, 2012 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member

joined:June 2, 2006
posts:2112
votes: 2


I'm really curious to see the results of the "cleaning" actions performed by webmasters.
Will it matter?
5:02 pm on June 29, 2012 (gmt 0)

Full Member from US 

5+ Year Member

joined:May 16, 2006
posts: 255
votes: 0


Sounds like it could do two things:

1) Remove low quality links from your backlink profile
2) Tell Bing which sites are a source of low quality links
5:22 pm on June 29, 2012 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Jan 12, 2006
posts:1304
votes: 0


I think this is significant for webmasters. Folks have been begging Google to implement such a feature to help fight low-quality IBLs (inbound links), and Bing has actually moved forward with it.
5:29 pm on June 29, 2012 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member ken_b is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Oct 5, 2001
posts:5668
votes: 60


It doesn't say if they actually remove the links from your inbound link count, or did I miss that somewhere?

It does say not to expect any dramatic results.

Worth trying.
5:36 pm on June 29, 2012 (gmt 0)

Full Member

10+ Year Member

joined:Jan 2, 2005
posts:330
votes: 0


Lots of webmasters have been requesting this feature and it sounds like they are actually listening.

+1 Bing
6:04 pm on June 29, 2012 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member

joined:July 26, 2006
posts: 1619
votes: 0


Yup, looks like they just set it live. Thanks Bing!
6:18 pm on June 29, 2012 (gmt 0)

Full Member

5+ Year Member

joined:Jan 9, 2007
posts:254
votes: 0


Is Disavow a honeypot?

Is it going to make an impact if you block links?

Would you rather ask webmasters to remove links yourself, or disavow them? What would be your preferred choice?

What if you disavow links that are actually adding to the overall quality of your website?
6:34 pm on June 29, 2012 (gmt 0)

Full Member

10+ Year Member

joined:Jan 2, 2005
posts:330
votes: 0


So I assume this means that Bing doesn't simply ignore links perceived as bad?

Otherwise, this tool would be pretty pointless.
6:50 pm on June 29, 2012 (gmt 0)

Junior Member

10+ Year Member

joined:Oct 30, 2005
posts:120
votes: 2


It would be interesting if what it actually does is penalize all the sites that receive links from the sites submitted to the tool.
7:06 pm on June 29, 2012 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member

joined:July 26, 2006
posts: 1619
votes: 0


Would you rather ask webmasters to remove links yourself, or disavow them? What would be your preferred choice?


Many of us have tried to get links removed, but personally I've never gotten them to do it or to even respond.
10:22 pm on June 29, 2012 (gmt 0)

Senior Member

WebmasterWorld Senior Member sgt_kickaxe is a WebmasterWorld Top Contributor of All Time 5+ Year Member

joined:Apr 14, 2010
posts:3169
votes: 0


Bing wants our help identifying bad sites; I doubt this is anything more.

That's because I have a hard time believing any reports we provide will actually help our own sites directly. Search engines don't trust data when webmasters provide it; they view it as manipulative.
10:26 pm on June 29, 2012 (gmt 0)

Senior Member

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:July 3, 2002
posts:18903
votes: 0


It would be interesting if what it actually does is penalize all the sites that receive links from the sites submitted to the tool.

As a site that links out, how do I find out whether any of the sites I link to have disavowed an incoming link? I'll obviously never want to link to such a site.

So begins the "negative-negative" SEO. Site A isn't doing too well. Competitor sites B and C are doing great and have some fantastic links from site D.

Owner of site A secretly sets up site E and gets some links to it from site D. Owner of site E then disavows those links from site D in the hope that all of site D's outgoing links are discounted, thereby harming sites B and C in the process.
10:33 pm on June 29, 2012 (gmt 0)

Senior Member from FR 

WebmasterWorld Senior Member leosghost is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Feb 15, 2004
posts:6717
votes: 230


Frugal reuse of pertinent posting there :)
11:06 pm on June 29, 2012 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Sept 14, 2011
posts:730
votes: 18


Owner of site A secretly sets up site E and gets some links to it from site D. Owner of site E then disavows those links from site D in the hope that all of site D's outgoing links are discounted, thereby harming sites B and C in the process.


You're overthinking this. I think if there is a continual negative response to your site over a period of time, then your links will be discounted; no single site could achieve this.

I also think this provides a strong quality signal about the site that links out. Kudos to Bing for bringing this out.

Come to think of it, if site A could get links from site D, perhaps it wouldn't be doing so badly in the first place!
11:22 pm on June 29, 2012 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:June 4, 2002
posts: 1785
votes: 2


It seems to me the search engines should come up with something we can set up in an .htaccess file.

Having to do this for both Google and Bing and any other search engine you want to be listed in can get tiresome.
11:31 pm on June 29, 2012 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Sept 14, 2011
posts:730
votes: 18


Interesting idea, Lorel.

The search engine would need to show the referer; then you could disallow based on referer in .htaccess.
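
To be clear, neither Bingbot nor Googlebot actually sends the linking page as a Referer when crawling, so this is a thought experiment only. Here is a minimal Python (WSGI) sketch of what the idea would look like if a crawler did announce the link source that way; the domain names and the whole mechanism are hypothetical:

# Purely hypothetical sketch of the idea above: if a crawler announced the
# linking site in its Referer header (which real crawlers do NOT do), a tiny
# WSGI layer could answer "no" for disavowed link sources. Illustrative only.
from wsgiref.simple_server import make_server
from urllib.parse import urlparse

DISAVOWED = {"spam-example.com", "low-quality-example.net"}  # made-up names

def app(environ, start_response):
    referer = environ.get("HTTP_REFERER", "")
    host = urlparse(referer).hostname or ""
    if host in DISAVOWED:
        # Tell the (imaginary) crawler we don't vouch for this link source.
        start_response("403 Forbidden", [("Content-Type", "text/plain")])
        return [b"Link source disavowed"]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"OK"]

if __name__ == "__main__":
    make_server("", 8000, app).serve_forever()
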
11:45 pm on June 29, 2012 (gmt 0)

Senior Member

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:July 3, 2002
posts:18903
votes: 0


So, if a page has incoming links from 100 different websites, the search engine would need to visit your page 100 times in quick succession, each time listing a different linking site as the referrer, so you could say "yes" or "no" to each one.

Is that how it would work?
3:57 am on June 30, 2012 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Sept 28, 2001
posts:1380
votes: 0


Not anything I want to rush into, but kudos to Bing for playing the "friend to webmasters" card. They have been doing that a lot lately, and I'm really happy to see that.
9:15 am on June 30, 2012 (gmt 0)

Senior Member

WebmasterWorld Senior Member zeus is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Apr 28, 2002
posts:3443
votes: 1


Seriously, how much time do they want us to spend on all these "rules" from Bing and Google? I just don't have time for all that non-user/visitor-related work. It can't be that we have to spend SO much time on work that isn't site building; we don't work for Google or Bing.
9:30 am on June 30, 2012 (gmt 0)

Full Member

10+ Year Member

joined:Jan 2, 2005
posts:330
votes: 0


This needs to work with the prerequisite that the site that has had a link ignored will not be penalised. Simple. That way there is no chance of negative SEO.
10:37 am on June 30, 2012 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Mar 7, 2003
posts: 1048
votes: 0


Maybe there will be a new form of XML file - like "linkmap.xml" - where you list good and bad links and hook it up to Webmaster Tools.

It would have to be secure at some level (probably the "VERY" secure level), unless security by obscurity would be good enough; or, better yet, invent a new form of htaccess file that lists bad links and good links.

Would it be all that bad if this file were semi-public, in that webmasters in the know could look at the file, learn the good/bad links, and copy it for their own purposes?

There are a couple of ways to do this - one might be to "open source" it so everyone who mattered could learn, copy, and reuse it. Or make it super private.

Either way, a whole new series of businesses would be born.

Is this (shouldn't this be) being discussed at an IETF (Internet Engineering Task Force) or IEEE level, by any chance?

>Click here for the newest list of bad links that you should be blocking! :-)
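
No search engine supports anything like a "linkmap.xml" today; the format below is invented purely to illustrate the idea. A short Python sketch of what generating such a file could look like, with made-up element names:

# Sketch of the hypothetical "linkmap.xml" idea from the post above.
# The file format and element names are invented for illustration only.
import xml.etree.ElementTree as ET

def build_linkmap(good_links, bad_links, out_file="linkmap.xml"):
    """Write a simple XML file listing vouched-for and disavowed link sources."""
    root = ET.Element("linkmap")
    for status, links in (("good", good_links), ("bad", bad_links)):
        group = ET.SubElement(root, status)
        for url in links:
            ET.SubElement(group, "link").text = url
    ET.ElementTree(root).write(out_file, encoding="utf-8", xml_declaration=True)

if __name__ == "__main__":
    build_linkmap(
        good_links=["http://respected-example.org/article"],
        bad_links=["http://spam-example.com/", "http://link-farm-example.net/dir/"],
    )
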
10:51 am on June 30, 2012 (gmt 0)

Full Member

10+ Year Member

joined:Jan 2, 2005
posts:330
votes: 0


Great idea, chewy. If this is the future and all SEs follow this process, it will go a long way toward avoiding duplication.
6:28 pm on June 30, 2012 (gmt 0)

Junior Member

joined:May 24, 2011
posts: 58
votes: 0


Interesting choice of title. This thread now ranks on page one for "reverse condom", right above medhelp.org.

:)
1:48 pm on July 1, 2012 (gmt 0)

Moderator from GB 

WebmasterWorld Administrator 5+ Year Member Top Contributors Of The Month

joined:Apr 30, 2008
posts:2507
votes: 140


Hmm... this new feature, especially if it is implemented by Google too, may actually help get links removed rather than needing to disavow them. An email like the example below may produce better results in getting the links removed:

Dear Webmaster of www.example.com,
Could you please remove the link on your page A that points to my page Z. Should you not comply within N days, I will be forced to use the Disavow Links feature in Webmaster Tools, which *may* send negative signals to search engines about your page/site.
Regards...
2:07 pm on July 1, 2012 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Mar 23, 2002
posts:659
votes: 0


This is bad in my opinion.

It transfers the burden of who links to your website onto you personally.

You are now responsible for policing incoming links, and theoretically on the hook penalty-wise if you don't identify questionable links.
5:37 pm on July 1, 2012 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:June 4, 2002
posts: 1785
votes: 2


We're already "on the hook penalty-wise" with Penguin penalizing site-wide links to our site that we did not set up, i.e., negative SEO (which Google claims can't hurt our sites).
3:07 pm on July 2, 2012 (gmt 0)

Full Member

5+ Year Member

joined:Oct 19, 2007
posts:209
votes: 0


A good step in the right direction. It would be better to just not count links from perceived bad sites, but I guess that could be manipulated rather easily as well, resulting in a free-for-all with paid links.
5:35 pm on July 2, 2012 (gmt 0)

Junior Member

10+ Year Member

joined:Sept 21, 2000
posts:69
votes: 0


Anyone who does this is NUTZ. This is a verbal algo tweak, and nothing more. Any serious SEO knows what works in Bing. This is just an attempt to get those who don't know any better to stop doing it.
3:55 am on July 3, 2012 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member fathom is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:May 5, 2002
posts:4110
votes: 109


You are now responsible for policing incoming links, and theoretically on the hook penalty-wise if you don't identify questionable links.


That's absolute nonsense.

To start... it's a power play to keep nipping at Google's market share... nothing more.

Only in the rarest of cases can someone (other than the owner, or others with the express permission of the owner) harm you through links.

A competitor COULD exploit your own webspam, which caused you to rank (rankings you should not have had to start with), and take that out with their own webspam... that's possible... even probable... actually, 100% a certainty. But to say you need to police links you never developed... is categorically wrong.

Police your own actions... and the buck starts/stops there.