Why are these competitor rankings so high?


jediviper

8:39 am on Mar 6, 2020 (gmt 0)

5+ Year Member Top Contributors Of The Month



So the usual guides for ranking at the Nr.1 position always show the same things:
check your competitors with Ahrefs (or other SEO tools), see how many backlinks they have, how much DR/DA, etc., and try to do something similar.

Well, in this image you can see that this rule is not accurate, or that the data from Ahrefs is not correct.
[drive.google.com...]

  • The Nr.1 result here has one of the lowest counts of backlinks and referring domains.
  • It has a similar DR but a worse UR than the 4th result, and that 4th result still has thousands more backlinks.

Do you have any other explanations?

    Robert Charlton

    11:53 am on Mar 6, 2020 (gmt 0)

    WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



    So the usual guides for ranking at the Nr.1 position always show the same things: check your competitors with Ahrefs (or other SEO tools), see how many backlinks they have, how much DR/DA, etc., and try to do something similar. Well, in this image you can see that this rule is not accurate, or that the data from Ahrefs is not correct....
    jediviper, the simple answer is that Ahrefs is not Google.

    This is the first time I've heard this process called a "rule". If you break the process down, it rests on a misplaced belief in correlation. I'd call it a superstition, a mistaken belief that these tools have successfully reverse-engineered the Google algorithm. I'm also guessing that most of the tools haven't updated their original set of correlations as quickly as Google's been changing its own ranking criteria.

    I'm curious... does the process, as you see it, suggest that the different metrics of all the tools should roughly correlate with each other, so that an SEO using Ahrefs for its backlink evaluations should do about as well as a site that used, say, "b-Rush" or a different tool/dataset combo?
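    If you wanted to test that for yourself rather than take it on faith, here's a minimal sketch in Python using SciPy's spearmanr (the DR numbers below are invented; the idea is to swap in a real export of position/DR pairs for one SERP) that rank-correlates a tool's scores against actual positions:

    # Minimal sketch: does a tool's DR actually track Google positions on
    # one SERP? Invented numbers; replace with a real (position, DR) export.
    from scipy.stats import spearmanr

    positions = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]           # Google rank
    dr_scores = [38, 71, 55, 74, 49, 62, 30, 58, 44, 52]  # tool's DR (made up)

    rho, p_value = spearmanr(positions, dr_scores)
    print(f"Spearman rho = {rho:.2f} (p = {p_value:.2f})")
    # If the tool had truly reverse-engineered Google, rho would sit near
    # -1.0 (higher DR meaning a better, i.e. lower, position). On real
    # SERPs it usually doesn't.

    Run the same check with two tools' scores against each other and you've answered the cross-tool question too.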

    I do feel that you're surely selling the SEO process short if that's all that's done with it... or if you think that's all there is to Google.

    There is data to be extracted from some of these tools, and there are some semantic SEO tools that are helpful now... but the real optimization, IMO, is in designing a site whose material keeps you totally absorbed... with stuff you've never seen before. E.g., the new YouTube interface, now often with really good videos, suggests how absorbing a site can be... hooking me for more hours than I'd like... I'm genuinely learning from those videos. Now to scale that expansiveness of YouTube down to my own sites.

    I sometimes find that, say, SEMrush gives me a rough guide to basic PPC keyword search frequency. If people make money or break even paying for certain phrases year after year, then the targets can't be too far off.... But the shape of queries is evolving quite a bit, and PPC and organic are different from each other. In organic, for me anyway, it's no longer about one-, two-, and three-word phrases with different stemming and singular/plural variants. On a business site, say, I feel you need to paint a portrait of what the business does, where it can be helpful to the user, and what would make them want to do business with you. You need to be trying to blow people's minds about how interesting that material is to them... not putting your faith in a random pile of third party metrics.

    I've been searching Google for a while now with freeform questions, mostly for research, and sometimes results are pretty good, sometimes way off, but they get better with more and more data... and sometimes I'm amazed at the results I find... But I have enough of a gut sense of how search works to know what changes to make as I explore a query. I spend a lot of time searching. Targeting the casual searcher should be even easier.

    I also look at why the competition ranks... really check out what I like, what Google likes, and what works for competitors... and if I imitate anything, it's what the best developers are building... not the number of links they've somehow gotten. And I spend lots of time learning the business model and working with the client on how best to make it work. I assume the links, with a bit of promotion to get traffic flowing, will take care of themselves.

    So where does that leave the third party tools, where you might not even be looking at the sites... just comparing DR scores on a chart? I don't think that tells you much.

    Returning to your original strategy, to those rote rules you use... if Ahrefs had its own search engine, and you used Ahrefs to analyze it, chances are you might do better than you would on Google's search engine.

    ---

    Here's a thread from last year about third party tools and how Moz improved its DA metric, which was a very flawed metric at the start. In the thread, martinibuster really tears apart their early approach, which they kept until very recently, and goes into the kinds of changes they've made.

    Moz Domain Authority 2.0 Update on 5 Mar 2019
    March, 2019
    https://www.webmasterworld.com/google/4937759.htm [webmasterworld.com]

    From what I'm seeing, Google has been leap-frogging to site-classification approaches that may be leaving all the tools behind.

    Sorry to go on so long, but your question is a good one.... I'm guessing as I read the monthly SERPs and Updates thread that most posters there aren't even looking at Google when they optimize... they're just reading reports, but, in the end, they're not happy with the results.

    jediviper

    12:55 pm on Mar 6, 2020 (gmt 0)

    5+ Year Member Top Contributors Of The Month



    Well, I believe that what these SEO tools are doing is finding patterns. Sometimes these patterns make sense and sometimes they do not.
    In this real example of a popular keyword, I think it's obvious that backlinks are NOT what is pushing the first domain to remain at the Nr.1 position. And since I have been following this ranking list for quite some time, I can also confirm that it's not a one-time thing. The domain in the first position has been there for around a month and was steadily climbing to reach it.

    For me as an SEO, it's more about understanding the reason that brought this domain to success and let it overtake all the competition. And I can confirm that it's not because of the nice design of that page either.

    lammert

    2:27 pm on Mar 6, 2020 (gmt 0)

    WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



    Many factors can play a role. One is the size of the site and its link taxonomy: are the incoming links diluted over a large, non-specific target area, or do they point, directly or indirectly, at highly on-topic landing pages?
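    As a toy illustration of that dilution (all data invented), you could simply measure what share of the incoming links actually lands on on-topic pages:

    # Toy illustration of link dilution: what share of incoming links lands
    # on pages about the target topic? All data invented.
    backlinks = [
        {"target": "/",                "on_topic": False},
        {"target": "/blog/widgets",    "on_topic": True},
        {"target": "/about",           "on_topic": False},
        {"target": "/widgets/pricing", "on_topic": True},
        {"target": "/blog/widgets",    "on_topic": True},
    ]

    on_topic = sum(1 for link in backlinks if link["on_topic"])
    share = on_topic / len(backlinks)
    print(f"{on_topic}/{len(backlinks)} links hit on-topic pages ({share:.0%})")
    # A smaller site with most of its links on-topic can beat a bigger site
    # whose thousands of links are spread over unrelated pages.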

    Other factors involve the way users interact. Metrics like bounce rate, average pages per visitor, average time on site, and source of visitors (search engines, browser bookmarks, etc.) are all signals Google could be using to adjust the SERPs.
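    For concreteness, here is how the first two of those metrics fall out of raw session data, in a minimal sketch with made-up page counts per visit:

    # Minimal sketch of two engagement signals, from made-up session data:
    # each number is the count of pages viewed in one visit.
    sessions = [1, 4, 1, 2, 6, 1, 3]

    bounce_rate = sum(1 for pages in sessions if pages == 1) / len(sessions)
    pages_per_visit = sum(sessions) / len(sessions)

    print(f"bounce rate: {bounce_rate:.0%}")          # 43%
    print(f"pages per visit: {pages_per_visit:.1f}")  # 2.6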

    divergence

    7:09 pm on Mar 6, 2020 (gmt 0)

    5+ Year Member



    Don't overthink this; you are forgetting one basic fact... Ahrefs honors robots.txt.

    All a smart, strategic competitor has to do is enter this in their robots.txt file:

    User-agent: AhrefsBot
    Disallow: /

    and then the competitor's on-site stats freeze in time on Ahrefs. The only data Ahrefs can keep updating in its report is the domain's backlinks, which it discovers by crawling other sites. So six months later, most of the data in Ahrefs will be stale and won't give you any information worth using.
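    If you want to check whether a competitor has done this, Python's standard-library robotparser can evaluate a robots.txt for you (example.com below is just a placeholder):

    # Check whether a robots.txt blocks AhrefsBot, standard library only.
    # Parsing the rules locally here; rp.set_url(...) plus rp.read() would
    # fetch a live file instead. example.com is a placeholder domain.
    from urllib import robotparser

    rules = [
        "User-agent: AhrefsBot",
        "Disallow: /",
    ]
    rp = robotparser.RobotFileParser()
    rp.parse(rules)

    print(rp.can_fetch("AhrefsBot", "https://example.com/page"))  # False: blocked
    print(rp.can_fetch("Googlebot", "https://example.com/page"))  # True: unaffected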

    PS: Ahrefs, Moz, and Majestic are not the only analytics tools in town, and they all honor the robots.txt file.

    lammert

    7:23 pm on Mar 6, 2020 (gmt 0)

    WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



    Thanks for the insight, and welcome to WebmasterWorld, divergence!