Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

Your favourite tool for link audit on one site?

         

FranticFish

9:58 am on Oct 1, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



* note to mods this topic doesn't seem to have been addressed in a few years, hope an exception to normal rules is OK *

I am reasonably familiar with link data providers and have - to now - always used them for competitive research. Indeed, nearly every offering I've seen is built around the concept of running reports on a number of domains rather than just ONE domain.

I am looking at a rebrand and domain move situation where a domain point / global 301 is not really an option. Factors include link building techniques employed 15 years ago when things were very different.

If mods allow, I would welcome opinions on a tool that allows a deep dive into the link profile of ONE domain. And please, no-one say GWT - of course we use it, but I'm hoping to identify a data provider that doesn't obfuscate their data :)

tangor

11:00 am on Oct 1, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



What is it that you want to audit?

Links work?
Mismatched?
Dead End?

Curious minds and all that. :)

RedBar

1:33 pm on Oct 1, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I'm slightly puzzled too. Do you simply want to check that the new domain works correctly?

FWIW I do not use anything other than checking stuff on my desktop .. I assume I am misunderstanding your question?

Pjman

1:53 pm on Oct 1, 2020 (gmt 0)

10+ Year Member Top Contributors Of The Month



I haven't seen anything that is better than Ahrefs for link analysis. If anybody has others, love to hear it.

FranticFish

2:09 pm on Oct 1, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Thanks for all replies. I'll try to clarify.

The purpose is to see which domains link to domain X, where X is my client's site.

The way most link data providers work is this:
- they have plans which are renewable monthly;
- each plan allows analysis of their dataset for a certain number of domains;
- for each domain, they will show you who links to that domain (that they are aware of);
- the depth of the analysis for each domain is also a factor.

So a plan might allow you to query their database for the first 5K inbound links they know about for 25 domains.
You'd need to upgrade to see more than 5K links, or to run reports on more than 25 domains.

It varies, but that is the general concept.

In this instance, I would like to try to track down every possible linking domain for just ONE domain. I am aware that the link discovery bots of these data providers can be blocked, and I've also read that Google may well know of far more links than even the biggest providers.

I'm aware of Majestic, Ahrefs, MOZ and also aggregators such as Cognitive. Looking for any other services people use, but the emphasis is on:
- identifying possible toxic links; and
- discovering as many linking domains as possible for just the one website.

tangor

9:30 pm on Oct 3, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Ambitious! Given the nature of the beast, these efforts might produce partial results UNLESS you (or your sources) have the horsepower of Bing or G to hit "all the web---all the time".

Then again, any info is usually better than no info!

I do find my raw logs will produce that kind of info pretty reliably for users who actually arrive from backlinks ... but I know I will never have a complete list of who has linked to me if nobody clicks on that "other link". Which means that even if there's a backlink TO ME from example.com, if it is misplaced or their users have no interest, it means nothing, to me or the search engines!
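
The raw-log approach tangor describes can be sketched quickly. This is a minimal, hypothetical example assuming the logs are in the common Apache/nginx Combined Log Format; the log path and `own_domain` value are placeholders, not anything from the thread:

```python
import re
from collections import Counter
from urllib.parse import urlparse

# Matches the tail of a Combined Log Format line:
#   "request" status bytes "referrer" "user-agent"
LOG_TAIL = re.compile(r'"[^"]*" \d{3} \S+ "(?P<referrer>[^"]*)" "[^"]*"')

def referrer_domains(lines, own_domain):
    """Count external domains whose links actually sent a visitor."""
    counts = Counter()
    for line in lines:
        m = LOG_TAIL.search(line)
        if not m:
            continue
        ref = m.group("referrer")
        if ref in ("", "-"):
            continue  # direct hit: no referrer recorded
        host = urlparse(ref).hostname or ""
        if host and not host.endswith(own_domain):
            counts[host] += 1  # external referrer = a live, clicked backlink
    return counts

# Usage sketch (hypothetical log path):
# with open("access.log") as f:
#     for host, hits in referrer_domains(f, "example.com").most_common(20):
#         print(f"{hits:6d}  {host}")
```

As tangor notes, this only surfaces backlinks that real visitors click, so it complements rather than replaces a link-database provider.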

FranticFish

7:47 am on Oct 4, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@ tangor - good point and relates to the 'quality' issue from my original post.

I am fairly certain the legacy domain has a number of questionable links from the early 00s, basically 'SEO-friendly' directories. This is why a simple 301 is a risky option IMO. I've seen first hand that there is a sort of amnesty for dodgy links, but that new link activity will end the amnesty. I think it's possible a global 301 would lead to a reconsideration, end the amnesty, and damage the new domain's reputation.

However, I don't want to throw away IBLs that aren't being clicked on. I agree that those are the only ones that should really count, but I don't think that's how things actually work yet, and I can't afford to miss links that could help.

Another factor is that time is an issue and we don't have the weeks / months it would take to contact all the webmasters for all the good links and get them moved.

Currently my thinking is along these lines:
- identify the total number of links (or as many as I can)
- put them into two camps: (i) disavow; (ii) keep
- disavow the dodgy ones
- then implement a 301 and upload the same disavow file on the new domain
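
The merge-and-bucket step above could be sketched roughly as follows. This is a hypothetical sketch, assuming each provider export is a CSV with a `domain` column and that the toxic set comes from hand review; the function names, file names and column name are all my own, not from any provider. The `domain:` directive is the syntax Google's disavow tool accepts for disavowing a whole domain.

```python
import csv

def merged_linking_domains(csv_paths):
    """Union of linking domains across several provider CSV exports."""
    domains = set()
    for path in csv_paths:
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                domains.add(row["domain"].strip().lower())
    return domains

def disavow_lines(all_domains, toxic_domains):
    """Google disavow-file syntax: one 'domain:' directive per bad domain."""
    return sorted(f"domain:{d}" for d in all_domains & toxic_domains)

# Usage sketch (hypothetical file names):
# bad = disavow_lines(merged_linking_domains(["majestic.csv", "ahrefs.csv"]),
#                     {"spammydir.example", "linkfarm.example"})
# with open("disavow.txt", "w") as out:
#     out.write("# pre-move disavow\n" + "\n".join(bad))
```

The same disavow.txt could then be uploaded against both the old and the new property, per the plan above.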

But I'm not 100% settled, still trying to see the angles.

Bottom line is that I need as complete a picture of links as I can get. Warts and all, the bad as well as the good.

not2easy

11:51 am on Oct 4, 2020 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Have you looked at Screaming Frog? I have not used it myself, but I often see it mentioned and suggested in discussions here for evaluation, checking and analysis.

From earlier this year: [webmasterworld.com...]

Robert Charlton

10:08 am on Oct 5, 2020 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



FranticFish, if I'm understanding your situation, this is a domain you or a client controls, and you want to check it out thoroughly before rebranding, to get rid of anything that might give you problems later on.

I'm thinking that, in effect, this situation may be essentially the same as doing a pre-emptive disavow analysis. If it's not, then I'm still not understanding your question.

If it is, I myself would strongly recommend Majestic, as if you were using it for an unnatural links analysis... and I'd use it together with GWT. You've described why you don't want to use GWT alone, but for fresh links, as I remember, GWT can include links you won't find elsewhere.

That said, in my experience, Majestic (at least several years back) could spot backlinks for sites which I did not control, and spot them much faster than the competition could. To me, this suggests that (apart from Google) it had the freshest database. I can't comment, though, on eg Ahrefs or Moz now, because it's been a while. Majestic should have the oldest publicly available database, as it's been around longer than the competition. I haven't kept track of which tool suites license Majestic's data and what they do with it.

It appears that you don't want to end up paying for more functionality than you need, which is very understandable. I should point out, though, that any complete link database is *necessarily* going to have data about links going in both directions, as that's what they see when they spider the web. Even if you're not searching now for link building possibilities, some of that data about sites at the other end of a link to you will be useful for your purposes.

Majestic's Trust Flow and Citation Flow, while different from Google's PageRank metric, might arguably be more useful to you in deciding which links to you are the most valuable, and in indicating which pages you want to keep and which you want to drop. They are metrics that, I trust, have been designed with much more in mind than the spammy link building norms of ten years or so ago, whose "algos" the competitive tools were until recently emulating... and Majestic reports sitewide links, overly aggressive anchor text, etc.

For a while, when Jim Boykin was owner and custodian of WebmasterWorld for several years, we had the use of Jim's disavow templates, and I think I remember his backend analysis also flagged things like an overabundance of .ru and .cn TLDs. I don't know how much of this Majestic does now, but if you can group sites by TLD, they shouldn't be hard to filter. In general, let the Google Historic Data patent plus your experience be your guide.

Additionally, coming up if not already here, niche specificity and various kinds of trust metrics are factors you will need to consider in linking.

The days of repurposing expired non-profit dot-orgs to become portals for, eg, job lead sites are, I hope, long gone.

I should mention also that way high up on the list of effective tools is the Link Detox tool from Christoph C. Cemper's Link Research Tools. It may be overkill for what you're doing, but if it's the basis of a rebranding it may well be worth it. I've never used this tool, but it's got an extremely good reputation among people I trust.

As for the subscription model, which works for agencies but can be difficult for solitary SEOs, that's maybe another discussion, but it's a problem for me as well, so I do understand.

Robert Charlton

10:28 am on Oct 5, 2020 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Mod's note: We don't ordinarily permit mention of SEO tools in this forum, but we're making an exception in this thread.

I need to emphasize, though, that our forum Charter also asks new members ("New User" status) not to post specifics or links until they reach Junior Member status. This is done to cut down on the spam, and we will continue to observe that guideline.

In other words, new users may not make recommendations in this thread. Thanks in advance for your cooperation.


FranticFish

9:14 am on Oct 6, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@ not2easy - thanks for the suggestion. It could be a nice tool to expand on tangor's advice to check raw logs, as it looks to be a spidering tool, and I've heard good things about it. However, it's for spidering YOUR OWN site (possibly helping you make sense of server logs re: referrers), and that's not quite what I need.

@ Robert Charlton - thank you so much for such a comprehensive reply. Yes, it's exactly as you state in your first paragraph.

I was edging towards Majestic myself and wondering if I could wangle a way of consolidating my allowance on multiple domains into one domain.

I have typically used MOZ and Cognitive for competitive link-building research, and have my own tool that I use to mash their metrics together for reference, although I always use my own impression to make the decision to approach for a link. I find the eye knows more than anything else. Awful sites can have great metrics and vice versa. But your advice that TF/CF are going to be more useful here than any other metric is sort of what I was thinking.

Thanks for the advice on the disavow templates.

Thanks also for the mention of Chris Cemper's offering. I have been aware of that for a while, but the recommendation is noted. Whilst I was impressed by the setup when I checked out the tool, I found Chris a bit too cagey about his data sources to take out a plan and try it first hand. This was probably 8 years ago. I had been using Raven Tools for competitive link research and getting fed up with their workflow. Having demoed the tools I could, I decided to build my own so I could control the workflow from data gathering through mashing it together and building a prospect list for outreach. Checking out all the different providers, some would try to hide where they got their data from. Did they have their own spider? Or were they reselling other people's data? Back then Ahrefs hadn't been up and running that long, and it was pretty much Majestic and MOZ. There were small ISPs that were also selling their data, and there were other sources.

LRT had a major USP which was 'data that no-one else has' but then refused to say what that data was. I did a round robin of all the 'data providers' and emailed Chris trying to understand how his offering differed but got no firm answers about exactly where the data his tool used came from, which put me off. From his point of view he was trying to protect his competitive advantage. From my point of view that's me making a blind purchase - and also IMO just not the way to look at it.

Razvan from Cognitive was far more forthcoming because he recognised (correctly) that I was NO threat to his business model whatsoever. He told me exactly what data he used and how he put it together, knowing full well that even if I wanted to copy that (I didn't), I'd still be months/years behind his dev cycle. Heck, even if I had the money to put something out within 24 hours I'd still be 6 months behind him. No threat.

Anyway, I digress. Thank you for such a thoughtful reply. Lots to think about :)