This 50 message thread spans 2 pages.
What Links Should One Really Disavow / Remove?
Multiple questions about which links to disavow:
1) If a link is already nofollowed, does it really need to be disavowed / removed?
2) If a link is on a page that is "noindex" does it really need to be disavowed / removed? (for example, a forum profile page.)
3) If it is a legitimate link (meaning we had nothing to do with creating that link) by some poster on a forum, but the forum is not really related to our site (although that individual page is somewhat related), does it really need to be disavowed / removed?
4) Do we need to disavow the hundreds of sites that scrape google SERPS and link to us?
5) If a link to our site is on a "links page" on another site (which links out to a dozen or more sites besides ours), does it really need to be disavowed / removed if the site is related to us?
6) If a link to our site appears sitewide (such as in a blogroll) on a site that is related to us, should it really need to be disavowed / removed?
7) Is it ok to upload disavow files in stages? Or is it best to do it all in one shebang?
Thanks in advance. I hope we can keep this thread on topic, as opposed to bemoaning the state of the internet...
I have to say my experience with disavow is that it doesn't work. So I would try link removal, and here are the top 5 most important types of links to remove, IMO:
1. Sitewide Links
2. Directory Links
3. Signature and Profile Links
4. Comment Links
5. Link Exchange
1: If it is on a spammy or irrelevant site, or contains rich anchor text, it should be in your disavow list.
2: If it is in Google's index and it does show in your backlink profile, your best bet would be to add it to your disavow file.
3: Check if that particular page ranks for its keywords.
4: Those links are often nofollowed; adding them to your disavow won't hurt!
5: No, if it is related and is on a quality site, there is no point in disavowing it. A lot of .edu and .gov sites have such pages with hundreds of useful links within.
6: It should be highly relevant and must not use rich anchor text to link to you; it should use your domain name as the anchor text! Better not to be doing a link exchange, though.
7: I would say do it in one go, but be very careful with the selection of sites in your list.
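On the nofollow question (1 above), you can spot-check a page yourself before deciding. Here's a minimal, hypothetical Python sketch (the function and class names are mine, not from any tool mentioned in this thread) that parses a page's HTML and reports whether each link to your domain carries rel="nofollow":

```python
# Hypothetical helper: for every anchor on a page whose href mentions
# `target`, report whether it carries rel="nofollow".
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    """Collect (href, is_nofollow) for each anchor pointing at target."""
    def __init__(self, target):
        super().__init__()
        self.target = target
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        a = dict(attrs)
        href = a.get("href") or ""
        if self.target in href:
            # rel may hold several space-separated tokens, e.g. "nofollow ugc"
            rel = (a.get("rel") or "").lower().split()
            self.links.append((href, "nofollow" in rel))

def audit_links(html, target):
    parser = LinkAudit(target)
    parser.feed(html)
    return parser.links
```

Feed it the downloaded page source and the domain you're checking; a True in the output means that link is already nofollowed.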
|Multiple questions about which links to disavow: |
1) If a link is already nofollowed, does it really need to be disavowed / removed?
No, but why not do it as a precaution? You have no control over the link if its status changes.
|2) If a link is on a page that is "noindex" does it really need to be disavowed / removed? (for example, a forum profile page.) |
No, but why not do the same as (1) above? How do you know what rules are in place for that site to "noindex" that page? They might change.
|3) If it is a legitimate link (meaning we had nothing to do with creating that link) by some poster on a forum, but the forum is not really related to our site (although that individual page is somewhat related), does it really need to be disavowed / removed? |
Honestly, who knows. Try to see it through Google's algorithmic eyes - would they see it as manipulative? Now that Google has introduced the disavow tool, the reality is that anything you feel uncomfortable with should go.
|4) Do we need to disavow the hundreds of sites that scrape google SERPS and link to us? |
Probably. Google would argue that they know the source of a document and will filter out those scrapers. But everyone knows it's far from 100%.
|5) If a link to our site is on a "links page" on another site (which links out to a dozen or more sites besides ours), does it really need to be disavowed / removed if the site is related to us? |
Depends. Would it be seen as part of a network? Would it be seen to manipulate your rankings? Would you consider this to be from a low quality site/s?
|6) If a link to our site appears sitewide (such as in a blogroll) on a site that is related to us, should it really need to be disavowed / removed? |
Depends. Is it natural? Then again, do you think Google would see it as natural even if it is?
|7) Is it ok to upload disavow files in stages? Or is it best to do it all in one shebang? |
It seems some folks who have not gone in hard enough have had to make several reattempts. That's a strategic question for you to answer, but the aim should be to clean up your profile.
@Planet13, additionally, I'd look at these [not exhaustive, and others may have better suggestions to share]:
- check PR-n/a or PR0 sites to see if they are really quality
- e-commerce anchor text terms
- new domains, unless authority e.g. a new discovery in science
- any site with a reputation for advertorials
- domains with little traffic, unless authority
- identical C class IP addresses; smells of an arrangement
- pages with a large number of external links, unless an authority
- on a page with reputation for other "dofollow" paid links
- press release articles [the original] - if someone picks it up and runs with it, fine.
- widgets with anchor text designed to boost PR / keyword terms
I'd also be cautious about articles distributed via editorial feeds and widgets, as Google may look at these as mass-produced "link bait" schemes. Some may argue the morality/rules, but really, who knows with Google; it's all about mitigating the risk. Articles created from the press release may not matter, provided the linking is given freely by the writers. It's all a risk you have to assess based on how Google will perceive it.
Moderation is the key while you turn these sites around, but Google has just made it a lot harder to influence results through linking [to put it politely] - so who knows what threshold you can afford to play around with, if at all.
Google also seems unclear about how much effort it requires webmasters to demonstrate in link removal or 404/page removal. This is painful, often expensive, and sometimes simply not possible. It shouldn't matter, as Google already knows which links are foul, but you don't know which links Google has flagged. At the very least, Google wants an admission of guilt before forgiveness, to send a loud message to stop link manipulation of results.
How to submit the request also has some currently unanswered questions around the reconsideration request element, i.e. whether it is necessary given the automated file parsing that Matt Cutts and other Googlers have declared [webmasterworld.com...]. To me it's ambiguous, since conflicting communication seems to exist from Google. No clarification, or input from others on their interpretation, has been posted to that thread as yet.
Thanks for the responses everyone. They really are appreciated.
|"Google also seems unclear about how much effort it requires webmasters to demonstrate in link removal or 404/page removal. This is painful, often expensive, and sometimes simply not possible." |
Yes it is painful. Almost as painful as the 35% drop in impressions and clicks since Penguin 2.0 rolled out. (According to our webmaster tools and GA accounts.)
Thanks again, everyone. Your input has helped me get my priorities straight.
Three quick points:
The new #1 site in my area ONLY has comment links (all from unrelated-topic sites, all with non-branded keyword links as the name).
I removed all my forum signature links and my ranking started getting worse, not better.
I disavowed a lot of low Domain Authority directories - no effect.
I.e., the opposite of what Google suggests.
Natural or unnatural? Give it time.
|I.e., the opposite of what Google suggests |
Do you have other links that you suspect? Did you cull the wrong ones?
Will a domain disavow “domain:website.com” address all subdomains too?
|Will a domain disavow “domain:website.com” address all subdomains too |
Based on this FAQ answer, I would assume yes.
|Q: Can I disavow something.example.com to ignore only links from that subdomain? |
A: For the most part, yes. For most well-known freehosts (e.g. wordpress.com, blogspot.com, tumblr.com, and many others), disavowing "domain:something.example.com" will disavow links only from that subdomain. If a freehost is very new or rare, we may interpret this as a request to disavow all links from the entire domain. But if you list a subdomain, most of the time we will be able to ignore links only from that subdomain.
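For anyone new to the tool, the disavow file itself is just a plain-text file with one entry per line: either a full URL or a `domain:` directive, with `#` starting a comment. The sites below are placeholders, not real examples from this thread:

```text
# Lines starting with "#" are comments.
# Disavow a single page:
http://spammy-directory.example.com/listing/widgets.html
# Disavow a whole domain, subdomains included
# (except for well-known freehosts, where a listed subdomain
# is usually honored on its own, per the FAQ answer above):
domain:link-exchange.example.org
domain:blog.example.net
```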
I'd like to add two further types of links which are questionable to disavow or not:
- Backlinks from sites which analyze other sites for onsite or offsite characteristics, for example <snip>"dot"com or <snip>"dot"net.
Often they produce nofollowed links, but sometimes not.
- Backlinks from small, unknown search engines which don't delete their SERPs.
Some of them have high PR, some not.
Basically, I think being found by other search engines should be a good signal to Google, so I don't like to disavow them.
What do you think?
[edited by: Robert_Charlton at 9:58 am (utc) on Jun 26, 2013]
[edit reason] no domain names - read charter [/edit]
I brought up small search engines in another thread.
I think there are actually three types of these:
1) Scrapers in disguise
2) Real search engines
3) Very niche search engines - e.g. those that only search a hand-picked set of sites rather than indexing the whole web.
2) no disavow
3) no disavow ?
|2) If a link is on a page that is "noindex" does it really need to be disavowed / removed? |
NOINDEX means "do not add to index". It does not mean "ignore all the links on this page" - that's why there's also a NOFOLLOW meta tag.
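To make that distinction concrete, here is how the standard robots directives look in markup (the URL is a placeholder):

```html
<!-- Keeps the page out of the index, but its links can still be followed: -->
<meta name="robots" content="noindex">

<!-- Keeps the page out of the index AND tells crawlers not to follow its links: -->
<meta name="robots" content="noindex, nofollow">

<!-- Nofollow applied to one specific link rather than the whole page: -->
<a href="http://example.com/" rel="nofollow">example</a>
```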
@deeper, if 2 is OK, then so is 3, which are actually somewhat higher quality (more manual input, for example, to pick sites in a niche).
I have just come across an interesting example of links. What do people think of this:
A translation site has UGC with translations of a word or phrase. It has examples of use (in English) quoted from other sites with links, and then the translations of those. It uses the same English text (and links) on multiple pages (with translations into a different language on each).
They also have a lot of pages that compare the same term in different languages, duplicating the English text across even more pages.
It's generally reasonable, and there are 2 or 3 partially duplicated phrases per word (at least judging by what GWT says about links per page on my site). The system seems to have gone a bit crazy for one particular word, though, and there are nearly 190k links to that page.
I am planning to disavow all links from the site. Is that the right thing to do?
While I haven't come across that exact scenario, I have disavowed various "usage" sites that are in English and define a word in English by quoting (and linking to) various sites around the web.
Most of those PAGES have a toolbar page rank of 0.
I don't know if disavowing them is going to help or not. We will see.
What I think about sites like that is:
1) They are pretty close to being scrapers
2) Pages have a toolbar PR of 0 or 1, and are pretty bad on other metrics BUT some of the sites are strong. I am inclined to think they pass negligible page rank, and any other benefits they may have are discounted or diluted as well.
So why not disavow them? The hesitation in the case of the site I mentioned above is because it is a strong site with an Alexa rank of less than 6,000.
That brings me back to a principle ColourOfSpring suggested in this thread [webmasterworld.com]: if it's not clicked on, it's a bad link. Well, I have had a tiny number of visits from that site, and the page with the 190,000 links pointing at it has had one visit with that site as the referrer in the last eight months. Now those ARE links no one ever clicks on.
|If a link to our site appears sitewide (such as in a blogroll) on a site that is related to us, should it really need to be disavowed / removed? |
Matt Cutts said in one of his youtube videos that if blogroll sitewides are natural, there are no issues.
So I guess you need to look at the other links on the blogroll, then decide whether the blogroll looks like the sort of thing an enthusiast has put together.
Speaking personally, if a blogroll looks like an enthusiast's links to related sites (rather than a list of paid links) I'd be happy to have the links.
I wonder if sites such as Wiki, BBC news, ebay have to worry about "links from bad neighbourhoods" and being punished by Google for such links that they have no control over. I bet they don't.
My advice would be to remove anything that doesn't look natural or editorially given, is irrelevant to what you do or all out looks low quality or paid.
Low quality links are links on websites that have low authority factors, low inbound links, low social metrics or plain low quality content. Getting good with tools you can learn how to assess these factors very quickly and make distinctions.
It's better to remove than disavow.
Begin actively monitoring your backlinks monthly, and stay on top of new links in the same fashion.
Always have a plan to replace lousy links with high quality naturally given links.
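As a rough illustration of the monthly monitoring idea above, here is a hypothetical Python sketch (the file format and names are assumptions of mine) that diffs two backlink exports, one referring URL per line such as the GWT "Links to Your Site" download, and surfaces domains that are new this month:

```python
# Illustrative sketch: diff two backlink exports (one referring URL
# per line) and list the referring domains that are new this month.
from urllib.parse import urlparse

def referring_domains(lines):
    """Reduce a list of backlink URLs to the set of referring hostnames."""
    return {urlparse(u.strip()).netloc for u in lines if u.strip()}

def new_domains(last_month, this_month):
    """Hostnames that link to you now but did not last month."""
    return sorted(referring_domains(this_month) - referring_domains(last_month))
```

Anything this flags can then be reviewed by hand against the criteria discussed above before it goes anywhere near a disavow file.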
What about backlinks from sites where you can analyze your own website or competitors for onsite and offsite characteristics?
Typically they show title tag, description, kw-density, IP, whois-data, internal links, number of pages...
Some of them have strong domains and millions or hundreds of thousands of pages indexed, but the linking page usually has PR "unranked".
@deeper, that is similar to the sites Planet13 and I discussed above: high PR home page, very large numbers of pages indexed, good backlink profile, etc., BUT low PR and a lack of original content on the pages that link out.
If anything these are worse, as the content is auto-generated, with no human input.
Planet13 has disavowed, and I plan to unless I can find a reason not to, so I think this sort of site should be disavowed too - EXCEPT that they are so easy for Google to spot that they have probably already been discounted.
According to Matt Cutts and John Mueller in a variety of hangouts/videos:
1. Try to get links removed only if you have a manual penalty. Otherwise, use the disavow tool.
2. The disavow tool is intended to be used only for links you've built yourself/had built on your behalf.
3. If you have links you don't like the look of, which weren't built by you, feel free to disavow them if you like, but this isn't the purpose of the tool.
You should listen more carefully...
Is it possible for you (or anyone else) to provide any references / links to these statements by Mueller and Cutts?
Thanks in advance.
I will look for a link, but I am pretty sure that MC has said to use the disavow tool for sites that are in an algorithmic link situation (i.e., Penguin), along with link removal.
Which suggests that it's usually not necessary to disavow links from scrapers and other sites with auto-generated/duplicate/copy-and-paste content.
The question is whether Google is as good as Cutts claims it is at spotting these sites.
To be honest, I am now completely confused about when it's appropriate to use the disavow tool other than 1) dealing with negative SEO and 2) reversing your own spam links.
Well, don't know if this will help you decide, but I just submitted a disavow file and I really only included sites that:
1) Had a blog comment from us, or
2) Had a forum profile page from us, or
3) Was a directory with a link to us, or
4) Was a link exchange partner, or
5) Was a scrape of one of the above sites.
I also tried to remove links from those sites, and even where I WAS able to remove the link, I still disavowed the link.
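For anyone assembling a similar file, the mechanical part of the step above can be sketched in a few lines of Python (names and approach are mine, not Planet13's actual process): collapse the flagged URLs from each category into deduplicated "domain:" entries.

```python
# Illustrative sketch: turn a list of flagged URLs (comment links,
# profile pages, directories, exchanges, scrapes) into deduplicated
# "domain:" lines ready for a disavow file.
from urllib.parse import urlparse

def to_disavow(flagged_urls):
    """One 'domain:' entry per distinct hostname, sorted for readability."""
    domains = sorted({urlparse(u).netloc for u in flagged_urls})
    return "\n".join("domain:" + d for d in domains)
```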
I didn't bother with all the sites that scrape google results, nor the ones that scrape wikipedia (since wikipedia links to us several times) nor the ones that scrape suite101, squidoo, ehow, chacha, and all the other sites that link to us (since we never used any article marketing on those sites to link to us - it was just the authors of those pages, whoever they were, who acknowledged us as a reference).
|"To be honest, I am now completely confused about when its appropriate to use the disavowal tool other than 1) dealing with negative SEO and 2) reversing you own spam links." |
I think now it is probably best to focus on #2. Remove as many of those links as you can, disavow them as well even after they are removed, and file a reconsideration request EVEN IF YOU DON'T HAVE A MANUAL PENALTY and just have a Penguin problem.
Google has said several times, very clearly, that an RR is ONLY suitable for manual penalties, not algo penalties.
BTW, convincing Google of your "good will" by removing as many bad links as possible before disavowing will only succeed if you give Google's crawlers enough time to register your efforts. Therefore you should wait at least two weeks (?) after deleting before disavowing.
Or am I thinking about this wrong?
Once the link has been removed and the URL containing the removed link has been recrawled, I'm not sure how a disavow could be beneficial.
I appreciate the fact that you are trying to help and are making suggestions in order for me not to shoot myself in the foot.
|"Therefore you should wait at least two weeks (?) after deleting before disavowing." |
|"While it's really important for the web-spam team when processing the reconsideration request to see significant effort put into resolving the issue at its roots (on those external sites), it can also be a good practice to at least have those sites listed in your disavow file in the meantime (use them in parallel, don't get bogged down with contact requests before adding them)." |
I am assuming this means it is OK / recommended to add links to the disavow file even before the link has been removed (or within a short time after it has been removed).
"Google has told several times very clearly, that a RR is ONLY suitable with manual penalties, not algo penalties."
That may be.
On the other hand, the cases where I have read about success from using the disavow tool on sites affected by Penguin were sites that filed a reconsideration request, despite NOT having a manual penalty.
The people who experienced this theorized that the disavow file only has an effect on rankings either 1) during a Penguin refresh, or 2) after filing a reconsideration request.
One person who claims to have experienced success using the disavow said it was during a PANDA (not Penguin) refresh.