I'm really curious to see the results of the "cleaning" actions performed by webmasters.
Will it matter?
Sounds like it could do two things:
1) Remove low quality links from your backlink profile
2) Tell Bing which sites are a source of low quality links
I think this is significant for webmasters. Folks have been begging Google to implement such a feature to help fight low quality IBLs, and Bing has actually moved forward with it.
It doesn't say whether they actually remove the links from your inbound link count, or did I miss that somewhere?
It does say not to expect any dramatic results.
Lots of webmasters have been requesting this feature and it sounds like they are actually listening.
Yup, looks like they just set it live. Thanks Bing!
Is Disavow a honeypot?
Is it going to make an impact if you block links?
Would you rather go and ask webmasters to remove links yourself, or disavow them? What would be your preferred choice?
What if you disavow links which are adding up to the overall quality of your website?
So I assume this means that Bing doesn't simply ignore links perceived as bad?
Otherwise, this tool would be pretty pointless.
It would be interesting if what it actually does is penalize all sites referred from the sites submitted to that tool.
|Would you rather go and ask webmasters to remove links yourself, or disavow them? What would be your preferred choice? |
Many of us have tried to get links removed, but personally I've never gotten them to do it or to even respond.
Bing wants our help in identifying bad sites, I doubt this is anything more.
That's because I have a hard time believing any reports we provide will actually help our own sites directly. Search engines don't trust data that webmasters provide; they view it as manipulative.
|It would be interesting if what it actually does is penalize all sites referred from the sites submitted to that tool. |
As a site that links out, how do I find out if any of the sites that I link out to have disavowed an incoming link? I'd obviously never want to link to such a site.
So begins the "negative negative" SEO. Site A isn't doing too well. Competitor sites B and C are doing great and have some fantastic links from site D.
Owner of site A secretly sets up site E and gets some links to it from site D. Owner of site E then disavows those links from site D in the hope that all of the site D outgoing links are discounted thereby harming site B and C in the process.
Frugal reuse of pertinent posting there :)
|Owner of site A secretly sets up site E and gets some links to it from site D. Owner of site E then disavows those links from site D in the hope that all of the site D outgoing links are discounted thereby harming site B and C in the process. |
You're overthinking this. I think if there is a continual negative response to your site over a period of time, then your links will be discounted; no single site could achieve this.
I also think this is a huge quality signal about the sites that link out. Kudos to Bing for bringing this out.
Come to think of it, if site A could get links from site D, perhaps they wouldn't be doing so badly in the first place!
It seems to me the search engines should come up with something we can set up in an htaccess file.
Having to do this for both Google and Bing and any other search engine you want to be listed in can get tiresome.
Interesting idea Lorel
The search engine would need to show the referrer; then you could disallow based on the Referer header in .htaccess.
So, if a page has incoming links from 100 different websites, the search engine would need to visit your page 100 times in quick succession, each time listing a different linking site as the referrer, so you could say "yes" or "no" to each one.
Is that how it would work?
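Just to make the idea concrete: a minimal sketch of what referrer-based blocking could look like in Apache's mod_rewrite (the directives are real syntax, but the domain name is made up, and the whole scheme assumes a crawler that reports the linking site in its Referer header, which no search engine does today):

```apache
# Hypothetical: reject a crawl request when the crawler reports
# a disavowed linking site in its Referer header.
RewriteEngine On
RewriteCond %{HTTP_REFERER} ^https?://(www\.)?spammy-links\.example [NC]
RewriteRule .* - [F]
```

Even if crawlers cooperated, you'd need one RewriteCond per disavowed site, which gets unwieldy fast; that's presumably why people are suggesting a dedicated file format instead.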
Not anything I want to rush into, but kudos to bing for playing the "friend to webmasters" card. They have been doing that a lot lately, and I'm really happy to see that.
Seriously, how much time do they expect us to spend on all these "rules" from Bing and Google? I just don't have time for all that non-user/visitor-related work. It can't be that we have to spend SO much time on work that isn't site building; we don't work for Google or Bing.
This needs to work with the prerequisite that the site that has had a link ignored will not be penalised. Simple. That way there is no chance of negative SEO.
Maybe there will be a new form of xml file - like "linkmap.xml" where you list good and bad links and hook it up to webmaster tools.
It would have to be secure at some level (probably the "VERY" secure level), and security by obscurity wouldn't be good enough. Or better yet, invent a new form of htaccess file that lists bad links and good links.
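Purely to illustrate the idea (no such file format exists; every element and attribute name here is invented), a "linkmap.xml" might look something like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical format, loosely modeled on sitemap.xml:
     nothing like this is supported by any search engine. -->
<linkmap>
  <link href="http://goodsite.example/article.html" status="endorse"/>
  <link href="http://spammy-links.example/directory.html" status="disavow"/>
</linkmap>
```

You'd register it in webmaster tools the way sitemaps are registered today, which is also where the access-control question would have to be solved.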
Would it be all that bad if this file was semi-public in that webmasters in the know could look at the file and learn good / bad links, and copy it for their own purposes?
There are a couple of ways to do this: one might be to "open source" it so everyone who mattered could learn from, copy, and reuse it. Or make it super private.
Either way, a whole new series of businesses would be born.
Is this (shouldn't this be) being discussed at an IETF (Internet Engineering Task Force) or IEEE level, by any chance?
>Click here for the newest list of bad links that you should be blocking! :-)
Great idea chewy - if this is the future and all SE's follow this process then this will go a long way to avoiding duplication.
Interesting choice of title. This thread now ranks on page one for "reverse condom", right above medhelp.org.
Hmm... this new feature, especially if it is implemented by Google too, may actually help get links removed rather than needing to disavow them. An email like the example below may produce better results in removing the links:
Dear Webmaster of www.example.com
Could you please remove the link from your page A to my page Z. Should you not comply within N days, I will be forced to use the Disavow Links feature in Webmaster Tools, which *may* send negative signals to search engines about your page/site.
This is bad in my opinion.
It transfers the burden of who links to your website onto you personally.
You are now responsible for policing incoming links and theoretically on the hook penalty wise if you don't identify questionable links.
We're already "on the hook penalty wise" with penguin penalizing site wide links to our site that we did not set up, i.e., negative SEO (which Google claims can't hurt our sites).
A good step in the right direction. It would be better to just not count links from perceived bad sites, but I guess that could be manipulated rather easily as well, resulting in a free-for-all with paid links.
Anyone who does this is NUTZ. This is a verbal algo tweak, and nothing more. Any serious SEO knows what works in Bing. This is just an attempt to get those who don't know any better to stop doing it.
|You are now responsible for policing incoming links and theoretically on the hook penalty wise if you don't identify questionable links. |
That's absolute nonsense.
To start... it's a power play to keep nipping at Google's market share... nothing more.
In the rarest of cases can someone (other than the owner or others with expressed permission of the owner) harm you through links.
A competitor COULD exploit your own webspam which caused you to rank (rankings you should not have had to start with) and take that out with their webspam... that's possible... even probable... actually 100% a certainty. But to say you need to police links you never developed... is categorically wrong.
Police your own actions... and the buck starts/stops there.