

What metrics to use when evaluating competition?


goodroi

6:14 pm on Feb 3, 2015 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



What metrics do you track and pay attention to when analyzing domains & web pages? How is the metric useful to you?

I'll be honest: I still glance at the toolbar PageRank. Yes, I know I should be embarrassed :) Yes, I know it is out of date. Yes, I know it's been over 10 years since it had a decent correlation with rankings. It is just a super easy starting point, and you need to start somewhere, not to mention its sentimental value. Once I regain my senses, I fire up my analysis tools and look at the metrics that have a fighting chance of predicting SEO power.

I like looking at referring domains, anchor text distribution, size of site, Class C IPs and other metrics. Personally, I find it rare that a single metric answers what I need to know, so I often combine several different metrics to better understand what is going on.
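If you want to do a first pass on a raw backlink export yourself, something like this works (a rough Python sketch; the source_url / anchor_text column names and the file name are made up, so adjust them to whatever your tool actually exports):

```python
# Rough sketch: summarise a backlink export into referring-domain counts
# and an anchor text distribution. The CSV columns (source_url,
# anchor_text) are assumptions, not any particular tool's export format.
import csv
from collections import Counter
from urllib.parse import urlparse

def summarise_backlinks(path):
    domains, anchors = Counter(), Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            domains[urlparse(row["source_url"]).hostname] += 1
            anchors[row["anchor_text"].strip().lower()] += 1
    return domains, anchors

domains, anchors = summarise_backlinks("backlinks.csv")  # hypothetical file
print("Referring domains:", len(domains))
print("Top anchors:")
for anchor, n in anchors.most_common(10):
    print(f"{n:5d}  {anchor}")
```

A profile where one exact-match anchor dominates the distribution is usually the first thing that jumps out.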

adder

7:11 pm on Feb 3, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



Good topic. People tend to overlook competitive intelligence.

In my experience the best competitive overview is one that is paired with keyword research and done on at least 10 potential competitors at the same time.

In most cases you want to evaluate the market share or impact of your main competitors, and you're right, one or two "SEO metrics" won't give you an answer.

When I've identified 10 or so competitors, I use a tool that shows me what keywords each competitor targets organically and on PPC. There are half a dozen tools like that out there and half of them are rubbish but I'm sure you know which ones I mean.

Depending on how well I know the niche, this step also allows me to estimate how much $$$ they're making.

Metrics-wise, the scores introduced by the two largest backlink spiders are increasingly helpful: TrustFlow and CitationFlow (Majestic) and DomainAuthority (Moz).

Although, again, on their own, without the whole picture, they're not that helpful. I recently found a site that had DA9 and not a single backlink. Zero. Go figure. To be fair, the higher the DA, the more accurate it tends to be.

Similarly to yourself, I'll also look at unique IPs, subnets and domains linking back. Not only does it show how SEO-ed the site is, it may also reveal spammy tactics.
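Once you have the referring IPs, counting Class C subnets is trivial. A quick sketch (the IPs below are just placeholder examples):

```python
# Rough sketch: group referring IPs into Class C (/24) subnets. Lots of
# links from very few subnets can hint at a link network.
from collections import Counter

referring_ips = ["93.184.216.34", "93.184.216.40", "198.51.100.7"]  # placeholder data

subnets = Counter(".".join(ip.split(".")[:3]) for ip in referring_ips)
print("Unique Class C subnets:", len(subnets))
for subnet, n in subnets.most_common():
    print(f"{n:4d} links from {subnet}.0/24")
```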

Same with size of site. I agree it can be a good metric in the context of niche research, but equally, digging deeper may help you discover competitors' vulnerabilities that you can exploit. I always scan competitors' sites with XENU (free tool & no affiliation, dear moderators) - it helps you discover on-site issues. Basically, if you're evaluating potential niches, you want to look for places where the main competitor sites fall short. It means you'll have to put in less effort to get ahead of your competitors.

It's also interesting to compare the number of pages according to XENU with the number of pages indexed by Google. It may uncover sites with content issues.
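The comparison itself is back-of-the-envelope stuff - something like this, with both counts typed in by hand and thresholds that are purely illustrative:

```python
# Rough sketch: compare a crawler's page count against Google's indexed
# count. Both numbers are entered manually (XENU total vs the approximate
# result count of a site: query); the 0.5 / 1.5 thresholds are arbitrary.
crawled = 12400   # pages found by the crawler (hypothetical)
indexed = 3100    # approx. results for site:example.com (hypothetical)

ratio = indexed / crawled
if ratio < 0.5:
    print(f"Only {ratio:.0%} of crawled pages indexed - possible thin or duplicate content")
elif ratio > 1.5:
    print(f"{ratio:.0%} indexed vs crawled - possible index bloat (parameters, soft 404s)")
else:
    print(f"Index coverage looks normal at {ratio:.0%}")
```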

I still glance at the toolbar PageRank. Yes, I know I should be embarrassed :)

Haha, I think we all have some old habits. I still check Alexa if I'm evaluating several sites within the same niche and all their Alexa ranks happen to fall within the top 500,000. Of course, I don't make any decisions based on that. Just curious :)

aakk9999

7:22 pm on Feb 3, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



In my experience the best competitive overview is one that is paired with keyword research and done on at least 10 potential competitors at the same time.


Agreed!

I think the most important thing is to work out who the site's competition actually is, as it may be different from what it seems when you look at just a few main keywords.

Simplistically, we derive a list of competitors using a shortlist of nnn content-relevant keywords based on the keyword research. For each keyword in the shortlist, we record the top 10 Google.com SERP results, giving us a list of websites that rank on page 1 of Google.com for these keywords. These websites represent the strongest sites competing for any of the keywords in the shortlist.

We then rank these competitors by the number of shortlisted keywords they have on page 1 of the SERPs. Sites with the most keyword overlap are the strongest overall competitors.
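That ranking step is simple to automate. A rough sketch (the SERP data below is invented; collect it however you normally scrape or export it):

```python
# Rough sketch: rank competitors by how many shortlisted keywords they
# hold on page 1. Input is keyword -> list of page-1 domains (invented
# sample data).
from collections import Counter

serps = {
    "blue widgets": ["widgetco.com", "acme.com", "widgetworld.com"],
    "buy widgets online": ["acme.com", "widgetco.com", "megastore.com"],
    "widget reviews": ["widgetworld.com", "acme.com", "reviewsite.com"],
}

page1_counts = Counter()
for domains in serps.values():
    for domain in set(domains):   # count each domain once per keyword
        page1_counts[domain] += 1

for domain, n in page1_counts.most_common():
    print(f"{domain}: page 1 for {n} of {len(serps)} shortlisted keywords")
```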

Once the shortlist of the top 10-20 real SERP competitors is made, we use pretty much the same metrics Goodroi mentioned: external backlinks, volume and diversity of anchor text in external links, referring IP addresses, referring Class C subnets, Google PR, domain age etc., and try to rank the site being analysed within the list of competitors.

netmeg

8:09 pm on Feb 3, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I honestly don't look at TBPR or even links, subnets, domains, or any of that stuff.

Most of my client work is B2B Ecommerce, so I look at the competition more from a business standpoint than just a web standpoint.

I look at various things in SEMRush, like their keywords (and how many overlap with ours), the estimated traffic for those keywords, and how much of their traffic is derived from which keywords. (One unfortunately named competitor is getting 59% of his traffic on a single word - because he named his company after a child's toy, which has nothing to do with what he does or sells.) I look at a year or two of keyword position and traffic trends to determine whether he's going up or down in Google. I'll look at which of their pages is ranking for a core competency product, and compare it to ours.
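If you want to put a number on that kind of concentration, a quick sketch (the keywords and traffic figures are invented stand-ins for a tool export):

```python
# Rough sketch: how concentrated is a competitor's organic traffic?
# Input is keyword -> estimated monthly visits (invented figures).
traffic = {
    "toy name": 5900,
    "b2b widget": 1200,
    "widget supplier": 900,
    "bulk widgets": 2000,
}

total = sum(traffic.values())
for kw, visits in sorted(traffic.items(), key=lambda kv: -kv[1]):
    print(f"{visits / total:6.1%}  {kw}")
```

A competitor pulling most of its traffic on one irrelevant keyword is weaker than its topline numbers suggest.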

I go look them up in Bing and Google to see if they're running PPC, and whether or not they're using product / shopping ads.

I comb through their order information, return policy, shipping information - all that stuff - looking for areas where they excel or fall short.

I want to know what trust signals they're using, and whether or not they're a Google Trusted Store.

I'll go look at all their social accounts, and see if they're really using them properly, or have just hired some firm to "manage" them, which usually means all broadcast and no engagement. I want to know if they have unhappy customers trying to engage with them.

I'll look for reviews, positive or negative.

If I really want to dig into them, I'll run a Screaming Frog spider on their site and look for issues there too. Are they loaded with duplicate content, duplicate page titles and duplicate meta description tags? I want to know that.
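The duplicate check is easy to script against a crawl export too. A rough sketch (the url / title / meta_description column names and the file name are assumed, not Screaming Frog's exact export layout):

```python
# Rough sketch: flag duplicated titles and meta descriptions in a crawl
# export CSV. Column names are assumptions about the export format.
import csv
from collections import defaultdict

def find_duplicates(path, field):
    groups = defaultdict(list)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            groups[row[field].strip().lower()].append(row["url"])
    return {value: urls for value, urls in groups.items() if len(urls) > 1}

for field in ("title", "meta_description"):
    dupes = find_duplicates("crawl_export.csv", field)  # hypothetical file
    print(f"{len(dupes)} duplicated {field} values")
```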

I'll probably run a couple page speed tools on 'em.

For one client, a lot of his competitors steal his images, so I'll go into Google Image Search and look there too.

I might also go into the Internet Archive and look at previous versions of their sites.

I'll look at which platform they're using for their shopping carts, and scope out their navigation and how they've architected their site (yes I know that's not a verb).

I've been known to place small orders just to see how their checkout process works.

And that's just the first hour.

Basically, I like to think of myself as one of the velociraptors in Jurassic Park, testing the perimeter for weaknesses I can exploit.

I think this sort of thing gets me closer to what I need to know than pagerank, domains or links.

But that's just me.