| 10:15 pm on Oct 16, 2012 (gmt 0)|
What took so long?
| 10:18 pm on Oct 16, 2012 (gmt 0)|
Thanks to scrapers some people will have to make this disavow tool a new career ;)
| 10:26 pm on Oct 16, 2012 (gmt 0)|
I hope there is a root disavow feature where you can wipe out a whole domain rather than each individual page from that domain. That can probably save lots of time.
| 10:34 pm on Oct 16, 2012 (gmt 0)|
@SevenCubed, the blog post actually explains that you can use a domain wide directive in the file you upload.
My question is... is it effective against Penguin? All the language they used was about "unnatural link notices" and manual spam actions. What about Penguin?
| 10:37 pm on Oct 16, 2012 (gmt 0)|
Not sure how long Google has been working on this, but they probably didn't have a choice about implementing it since Bing Webmaster Tools has a "Disavow Links" function.
| 10:40 pm on Oct 16, 2012 (gmt 0)|
Whoa! Wait a minute! This is NOT what I had in mind when this was first talked about. It kills ALL THE LINKS(!) to a given website.
What kind of a tool is this? They might as well have come up with a "Kill your site now" Webmaster Tools feature.
If you disavow every single link to a site, you might as well simply abandon the site - why bother?
| 10:43 pm on Oct 16, 2012 (gmt 0)|
1script, yeah at first glance it does look like you're about to disavow links on the entire domain.
But if you click the button, you'll see that they allow you to control which links exactly you don't like/want, and not all of them.
And even if you accidentally disavow all your links, you can undo it ..according to the blog post.
It will be interesting to see tests done on just how effective this is.
[edited by: klark0 at 10:45 pm (utc) on Oct 16, 2012]
| 10:45 pm on Oct 16, 2012 (gmt 0)|
|@SevenCubed, the blog post actually explains that you can use a domain wide directive in the file you upload. |
Busted! now you know I didn't actually read the blog. Just the title was good enough as far as I was concerned. Glad to see that they do have it. Thanks.
| 10:49 pm on Oct 16, 2012 (gmt 0)|
Why do I have a feeling this is going to end up worse than handing out loaded guns to schoolkids?
| 10:54 pm on Oct 16, 2012 (gmt 0)|
So how are WE supposed to KNOW which links are good and which are bad? Someone wanna start a link ranking website?
| 10:55 pm on Oct 16, 2012 (gmt 0)|
First impressions:
- It really bothers me that Matt describes using this because you forum spammed, bought links, were in a link network, etc., etc... there is no mention at all of negative SEO from a competitor... they just assume everyone had a choice in getting these 'bad' links. I know I have some horrible links, had nothing to do with getting them, and can't DO anything about them. From reading this forum there are others in this same boat as well.
- Before this was announced I was pretty bitter about the whole 'bad' links affecting one's site. Particularly for the reason mentioned above, where I did nothing wrong to get 'bad' links but could do nothing about it either (we can't control another individual's site nor can we force them to do anything... usually).
My original thought was simple... if Google deems a site as 'bad' then just make those links zero, zilch, nothing - they pass no negative value nor positive for that matter. This solves both scenarios... paid links are worthless and negative seo (as an example) has no bearing on the target site. This also somewhat automates the process.
While I am glad they released this, as it was much needed, I still feel the above resolution would have been better. This just creates something else for the average webmaster to spend tons of time on, time that could be better spent elsewhere on the site in question.
Happy? Yes, I am, but feel it could have been handled better by the brains at big G.
| 10:56 pm on Oct 16, 2012 (gmt 0)|
|Why do I have a feeling this is going to end up worse than handing out loaded guns to schoolkids? |
Don't know about you, but this Google punch sure is mighty tasty! Here, have some!
| 10:56 pm on Oct 16, 2012 (gmt 0)|
I suggest the folks who have questions read the blog post and watch the video. All of the questions and misconceptions above are addressed.
|And even if you accidentally disavow all your links, you can undo it ..according to the blog post. |
But as Matt explains in the video (required viewing) while it might take weeks for the disavow to take effect, it likely will take longer for an undo to take effect.
Don't use this tool without the required reading and viewing.
With this thing called the Internet these days there is absolutely no excuse to get your information second hand. There is only one way that *you* can be sure that it is accurate.
| 10:57 pm on Oct 16, 2012 (gmt 0)|
|Why do I have a feeling this is going to end up worse than handing out loaded guns to schoolkids? |
Because a lot of folks think it's the ultimate tool for a webmaster, and will start using it to disavow anything they feel isn't an authority source.
| 11:04 pm on Oct 16, 2012 (gmt 0)|
The question I have is also about identifying which links are particularly bad.
I was hit by Penguin long before I received the mid-July "unnatural link" notice that everyone seemed to get, where the language was later softened.
I wish I had some feel for the progression of the Unnatural Link notices, since I received mine later than some batches that went out earlier. Should I assume that the threshold progressively got more strict? Or should I assume that it was a new link that was found close to that date that precipitated it?
My links aren't too bad, and none have been solicited in more than 5 years. When looking at which links to disavow would you be looking more at the timeframe around the notice? Or is the timing of a notice more or less irrelevant?
| 11:17 pm on Oct 16, 2012 (gmt 0)|
So happy this tool has been finally released and am looking forward to submitting the last of my fake SEO links that I couldn't get scrubbed. While Matt Cutts never said this would affect Penguin hit sites, he never said it wouldn't. Reading between the lines and after watching the 10 minute video, I suspect this will be about Penguin hit sites. In fact... they may have rolled this out to pave the way for the next Penguin algo update, which is long overdue.
For those struggling to identify which links are bad...I highly recommend a third party tool like ahrefs. Start with money anchors, create a checklist and do the grunt work of checking each one in a browser. I can find most based on the type of anchor text used (for my site and other sites) and the number of commercial links on the site. You can also hire a link scrubber like linkdelete. Cheap...and they'll send you a list of what they consider the worst links and let you choose which ones they'll try to remove for you. Not perfect (some legit links in our report got bad scores) but it's not a bad start.
[edited by: smithaa02 at 11:25 pm (utc) on Oct 16, 2012]
| 11:20 pm on Oct 16, 2012 (gmt 0)|
|Matt says that most sites should NOT use this tool. [pubcon.com...] |
Agreed, but maybe for different reasons to Matt's.
It could open up a can of worms for a lot of webmasters.
| 11:27 pm on Oct 16, 2012 (gmt 0)|
It bothered me too, that there is no mention of neg SEO.
A company I work for was punished under Penguin for what appears to be 1,000 highly spammy, likely automated links. The 1,000 links all target 4-5 keywords over 4-5 pages.
Penguin hit them hard, from 80,000 uniques a day to 40,000. Yet the site is unquestionably the authority of its niche, and I'd have thought 1,000 spammy links would neither have helped nor hindered, but clearly somebody is out there, laughing their socks off.
I have high hopes for this tool, but I have a feeling I'm going to be left underwhelmed.
| 11:28 pm on Oct 16, 2012 (gmt 0)|
This is a really useful tool, thanks Google.
I would reemphasize: Do not use this tool if you don't know what you're doing.
| 11:34 pm on Oct 16, 2012 (gmt 0)|
"Q: Should I create a links file as a preventative measure even if I haven’t gotten a notification about unnatural links to my site?
A: If your site was affected by the Penguin algorithm update and you believe it might be because you built spammy or low-quality links to your site, you may want to look at your site's backlinks and disavow links that are the result of link schemes that violate Google's guidelines."
| 11:56 pm on Oct 16, 2012 (gmt 0)|
Anyone have an opinion or thoughts about how this compares to the Bing Disavow tool?
| 12:04 am on Oct 17, 2012 (gmt 0)|
Okay, just sat down and watched the video and read the article... already have a question that should be cleared up by google... as per their article :
# Contacted owner of spamdomain1.com on 7/1/2012 to
# ask for link removal but got no response
# Owner of spamdomain2.com removed most links, but missed these
Q: Can I disavow something.example.com to ignore only links from that subdomain?
A: For the most part, yes. For most well-known freehosts (e.g. wordpress.com, blogspot.com, tumblr.com, and many others), disavowing "domain:something.example.com" will disavow links only from that subdomain. If a freehost is very new or rare, we may interpret this as a request to disavow all links from the entire domain. But if you list a subdomain, most of the time we will be able to ignore links only from that subdomain.
Pretty open-ended... so, if I enter domain:example.com does this apply to www.example.com, test.example.com, etc. as well? From the sounds of it this will disavow the domain and sub-domain as well unless it is an 'authority'.
Do you see where I am going with this? Now, in their page examples they list www in front... what if someone redirects to non-www or vice versa... are they accounting for both www and non-www versions? By spec (I believe), and to Google as well, www and non-www are different, and www is treated as a subdomain or separate entity if you will.
So, with that said, what is the consensus here?
| 12:08 am on Oct 17, 2012 (gmt 0)|
Add on to that www.example.com/... does this apply to /index.html, index.htm, index.php... what if they set the main page to /home.htm or something? Will it disavow the entire domain as it's the home directory?
| 12:16 am on Oct 17, 2012 (gmt 0)|
My understanding is that a domain: entry disavows all links from that domain; a different domain is a separate entity and needs its own disavowal entry; a domain: entry for a subdomain disavows all links from just that subdomain; and a plain URL entry disavows links from that specific page.
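Putting those pieces together, a sketch of what the upload file might look like, assuming the comment-and-directive format quoted from Google's blog post earlier in the thread (the spamdomain names are their placeholder examples):

```text
# Contacted owner of spamdomain1.com on 7/1/2012 to
# ask for link removal but got no response
domain:spamdomain1.com

# Owner of spamdomain2.com removed most links, but missed these
http://www.spamdomain2.com/contentA.html
http://www.spamdomain2.com/contentB.html
```

One entry per line: a domain: directive wipes a whole domain, a bare URL disavows just that page, and lines starting with # are ignored as comments.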
| 12:25 am on Oct 17, 2012 (gmt 0)|
Apparently there are tweets from pubcon claiming that being disavowed/on the receiving end of these requests... like Ezinearticles is bound to be... won't hurt you.
Interesting... makes sense, as dynamic sites like Wikipedia would be taken down in short order with disavows if this were possible. But I suspect google will still use this data somehow... someway. Maybe as a combination penalty... if you've been disavowed, stuffed your titles AND you link built, THEN you get penalized? Maybe they'll use it to test algorithms, like how Panda was constructed? We know google loves data and they'll find a use for this new gold mine of theirs.
Other pubcon tidbit is that this works pretty much like a nofollow... and perhaps that's all this is :/ ... If so, that has me a tad worried for Penguin hit sites, and I don't know if you can nofollow your way out of Penguin. Also, even if it applied to Penguin, does this mean we have to wait for the next data refresh and/or algo update? Google has all but indicated Penguin is about backlinks, but Cutts has been pretty tight-lipped about explicitly stating this is the case. Hopefully he can open up about this and the disavow tool to help so many confused webmasters.
What would be really cool is if in your webmaster tools you can see warnings where you've been disavowed by other webmasters which would provide incentive for (spam builders) to clean their own house.
[edited by: smithaa02 at 12:58 am (utc) on Oct 17, 2012]
| 12:31 am on Oct 17, 2012 (gmt 0)|
On the subject of the path variants... I suspect you don't need to spell out index pages and the www variants. Google has bragged about being very good with their internal canonicalization lately. So they probably know example.com, www.example.com, www.example.com/, example.com/index.html are one and the same. Logically, if they are reusing the nofollow algorithm (which it sounds like they are), then it would make sense that they have this covered. I mean, who worries about which (www.)example.com(/)(index) variant people link to with normal anchor links?
Regardless, google should clarify this to keep their databases from being swamped with duplicate entries.
| 12:41 am on Oct 17, 2012 (gmt 0)|
Nobody asked what happens to the site that the links are from when they are disavowed.
In fact, the more I think about it, the more it seems to me the tool is brilliant for Google.
All they have to do is look at which sites are mentioned most in "disavow from" lists and hey presto - a list of domains that either sell links, are low quality, or are spam domains is provided to Google.
And you cannot disavow a competitor if you do not have links from that domain pointing to yours... so this addresses the false-disavow problem, where you pretend a competitor is selling links.
So watch the next round of spam link wheels and sites being nuked now...
Just read the post from smithaa02, who posted whilst I was typing. I guess that well-known sites such as Ezinearticles and Wikipedia will be "immune" to disavows. I am not sure less-known sites will be. At least Google has a nice list of sites to manually review.
| 1:37 am on Oct 17, 2012 (gmt 0)|
Let's be honest... they are more than likely (highly likely) to use this data somehow. If anything they can collectively see that a high number of people manually submitted a disavow for x page or y domain.
In short, it sounds like we will be doing a lot of the work for them. Although, this approach brings something interesting to the table. Now we, the webmasters, will essentially be voicing our opinion on what WE think is a bad link. Google can then run some algo or report to show high submits on certain domains, pages, IPs, whois registrants, etc. and possibly go from there to do something if they so choose.
It seems as though we are, in essence, providing a manual review for them.
For someone to add a disavow they would need access to a site to do so, which means they have a direct tie to the site in question or are paid to have one. One must have a reason to add the disavow, so the individual sees something wrong with it, whatever that may be... essentially they feel it is doing them more harm than good. This provides G quite a good bit of data to determine that most people think a site is bad.
At the same time... you cannot add disavows for a site that is NOT linking to you (well, you can, but I would hope they check whether it links or not), so it would be impossible to do negative SEO to a competitor in terms of this disavow system... assuming they use this data in a ranking algo.
Lastly, it allows a 'bad' site to remove the links others deem as 'bad' rather than ignoring emails and requests - should they choose to remove them.
[edited by: mihomes at 1:44 am (utc) on Oct 17, 2012]
| 1:42 am on Oct 17, 2012 (gmt 0)|
Since they swiped the idea from bing, they had to wait a decent interval. Otherwise it would just seem as if they swiped the idea from bing.
:: off to investigate ::
:: mutter, grumble, swear ::
Where's the Select All button?
|Anyone have an opinion or thoughts about how this compares to the Bing Disavow tool? |
Yeah. You can FIND Bing's version. How do you get to g###'s version other than by clicking the link in the blog?