
Forum Moderators: DixonJones & mademetop


Good Page Rank Alternatives

Now Google has killed Toolbar Page Rank, what else might you use?

     
6:50 pm on Oct 9, 2014 (gmt 0)

Moderator This Forum from GB 

WebmasterWorld Administrator dixonjones is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Mar 8, 2002
posts: 2945
votes: 25


Google Toolbar Page Rank (GTPR) will most likely not be updated any more, according to Google [webmasterworld.com]. Whilst it was of limited use, due to its slow update cycle and its limited 1 - 10 scale, there remains huge merit in being able to quickly evaluate the "worth" of a page or site.

So what other tools can you use for this?

Way back in May 2012 I published some research showing correlations between PageRank and the popular metrics from Moz and Majestic. Three metrics correlated well:

1: Domain Authority (0.787 correlation)
2: Trust Flow (0.746 correlation)
3: Citation Flow (0.814 correlation)

In the comments, further research from Rand's team showed:
4: mozTrust correlated at 0.66
5: Page Authority correlated at 0.68
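For anyone wanting to run this kind of test on their own data, a correlation figure like those above is just a Pearson coefficient between two lists of scores. A minimal sketch in Python; the sample values here are invented purely for illustration:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented sample: Toolbar PageRank vs. a third-party metric for six sites.
toolbar_pr  = [2, 3, 3, 5, 6, 7]
third_party = [18, 25, 31, 48, 55, 70]  # e.g. a 0-100 "flow"-style score

r = pearson(toolbar_pr, third_party)
print(round(r, 3))  # strong positive correlation, close to 1
```

Rank-based coefficients (Spearman) are often preferred for this kind of study, since the metrics live on different scales, but Pearson is enough to illustrate the idea.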

Why Page Rank is not as good as Trust Flow or Page Authority as a Quality indicator
Citation Flow correlated better than Trust Flow in my test because, I believe, both calculations start with an absolute value for each web page - based, in Google's case, on link counts according to the original PageRank paper (I am happy to be corrected) and, in Majestic's case, on either IP counts or domain counts (I forget which now). In other words, modern self-replicating systems can inflate these counts artificially. Trust Flow uses a similar methodology but weights counts based on their proximity to known trusted sites, so it is less prone to artificial manipulation - although Google obviously have their own, less public, ways to avoid manipulation. Page Authority effectively applies a similar "proximity to trust" idea, but with an entirely different methodology, as it is based on Moz's ranking data: a site is trusted by search engines, ergo it is trusted in Page Authority. (This is a methodology Majestic cannot replicate, as Majestic has never been involved in rank checking.)
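The "proximity to trust" idea can be sketched with a toy personalized-PageRank calculation. This is an illustrative TrustRank-style model, not Majestic's or Google's actual algorithm; the graph, damping value, and page names are all invented:

```python
def rank_pages(graph, damping=0.85, iters=100, trusted=None):
    """Toy power-iteration PageRank. If `trusted` is given, the random
    jump restarts only at those seed pages (TrustRank-style), so score
    has to flow outward from trusted sites rather than from everywhere."""
    nodes = list(graph)
    seeds = set(trusted) if trusted else set(nodes)
    jump = {v: (1 / len(seeds) if v in seeds else 0.0) for v in nodes}
    rank = {v: 1 / len(nodes) for v in nodes}
    for _ in range(iters):
        nxt = {v: (1 - damping) * jump[v] for v in nodes}
        for v in nodes:
            share = damping * rank[v] / len(graph[v])
            for w in graph[v]:
                nxt[w] += share
        rank = nxt
    return rank

# Invented toy web: a trusted hub, two ordinary sites, and a link farm
# whose pages exist only to inflate "spam"'s raw link count.
web = {
    "hub":   ["a", "b"],
    "a":     ["b", "hub"],
    "b":     ["hub"],
    "spam":  ["hub"],
    "farm1": ["spam"],
    "farm2": ["spam"],
    "farm3": ["spam"],
}

plain   = rank_pages(web)                    # raw link-count flavour
trusted = rank_pages(web, trusted=["hub"])   # jumps restart at the seed

# The farm inflates "spam" in the plain scores, but with trust-weighted
# jumps the farm pages receive no score to pass on, so "spam" collapses.
```

This is the mechanism behind the manipulation-resistance claim: a self-replicating link farm can inflate a pure count-based metric, but contributes almost nothing once score must originate at trusted seeds.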

Moving to 2014
Over the intervening years I have received periodic notifications from a friend on the correlations with Page Rank, though often with limited sample sizes. If you want the strongest correlation with Page Rank, Citation Flow looked to be the best back then, though in more recent tests Trust Flow has been closer. If you want the strongest metric of QUALITY, however, then I think you should look at Trust Flow or Page Authority.

Here are some newer correlation tests on a smaller data sample:

Ahrefs DR: 76.32%
Trust Flow: 72.79%
Citation Flow: 70.99%
Webmeup: 97.03%

That last one looks SO close to PageRank that I can only assume the metric largely IS PageRank, taken from GTPR itself. If that is the case, then you can expect that correlation to deteriorate pretty quickly now that the data is not being updated.

I should point out that the sites analysed in this last table were all "fairly decent" sites in a narrow vertical. I have noticed that the different metrics on offer react very differently to "outliers" - sites that are either very good or very poor. The original sample was quite a large one, but (importantly, I think) it was generated randomly for the purposes of the test.
1:02 am on Oct 10, 2014 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member keyplyr is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Sept 26, 2001
posts:12913
votes: 893


Since I block these and most other parasites that leech their biz models on the worth of legit web sites, they are by definition worthless to me.
1:19 am on Oct 10, 2014 (gmt 0)

Moderator This Forum from GB 

WebmasterWorld Administrator dixonjones is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Mar 8, 2002
posts: 2945
votes: 25


That's an interesting argument... Google takes your entire content, not the link graph, so who's the leech? But most obey robots.txt. Majestic at least also obeys Crawl-delay if you are really paranoid.

In any event, you blocking these crawlers is your choice, but it doesn't mean that your pages cannot be evaluated using these tools. I equate it to a road map - just looking at the map, you do not need to GO to (say) Los Angeles to see that it must be larger than Dallas.
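For anyone who does want to block or throttle these crawlers, it is done in robots.txt. A sketch using the user-agent tokens these vendors have published (check each vendor's current documentation before relying on them, and note that Crawl-delay is a non-standard directive that not every crawler honours):

```
# Block Majestic's crawler entirely
User-agent: MJ12bot
Disallow: /

# Block Ahrefs' crawler entirely
User-agent: AhrefsBot
Disallow: /

# ...or, instead of blocking, throttle a crawler that honours
# Crawl-delay (at most one request every 10 seconds):
#
# User-agent: MJ12bot
# Crawl-delay: 10
```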
1:46 am on Oct 10, 2014 (gmt 0)

Preferred Member

5+ Year Member

joined:Mar 27, 2010
posts:423
votes: 0


But if LA blocks your map, you won't be able to know how big it is. You'll see a name and think it's a tiny village, which it is not. That's the point... There was no way to block PR, but there is a way to block these other services.
2:01 am on Oct 10, 2014 (gmt 0)

Moderator This Forum from GB 

WebmasterWorld Administrator dixonjones is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Mar 8, 2002
posts: 2945
votes: 25


Not quite true - if LA blocks the map, you still see lots of roads going into LA. The only thing you cannot see very well is how the internal roads connect, but you still know that LA is larger than Dallas by the size and quantity of the roads going into it.
8:29 am on Oct 10, 2014 (gmt 0)

New User

10+ Year Member

joined:Aug 20, 2009
posts: 18
votes: 0


It'd be mozRank & mozTrust for me (personal choice).
9:50 am on Oct 10, 2014 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member fathom is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:May 5, 2002
posts: 4110
votes: 109


DixonJones wrote:

In any event, you blocking these crawlers is your choice, but it doesn't mean that your pages cannot be evaluated using these tools. I equate it to a road map - just looking at the map, you do not need to GO to (say) Los Angeles to see that it must be larger than Dallas.


Contrary to your opinion, you have to have an understanding of both Los Angeles & Dallas to start any comparison. Where did the map come from to "not go" but "just look"?

When your tool returns "this domain has no links" is that because it actually has none?
11:06 am on Oct 10, 2014 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member keyplyr is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Sept 26, 2001
posts:12913
votes: 893


A few sneak in under the radar before they're detected & blocked. Still others steal that data from those that get through. I'm just saying that I categorically block any resource that builds its business model on scraping content or analytical data from my business.

I could care less if my site shows no links at these parasite tools. Most are obsolete and/or highly inaccurate which makes them irrelevant anyway.

That's an interesting argument... Google takes your entire content, not the link graph, so who's the leech?

So who's defending Google?
1:21 pm on Oct 10, 2014 (gmt 0)

Moderator from US 

WebmasterWorld Administrator martinibuster is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 13, 2002
posts:14923
votes: 492


I could care less if my site shows no links at these parasite tools.


Folks... this is a myth. :)

Blocking Majestic will not keep your backlinks from their graph. It will only keep Majestic from knowing your outbound links. To keep Majestic from knowing your inbound links you would have to persuade every site that links to you to block Majestic.

There's no downside to blocking third party crawlers. But there are no upsides, either. If these crawlers are slowing down your site, the problem is not the crawlers; it's your server.
1:39 pm on Oct 10, 2014 (gmt 0)

Moderator from US 

WebmasterWorld Administrator martinibuster is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 13, 2002
posts:14923
votes: 492


Third party metrics tend to measure ranking metrics. A site that is not trying to rank for anything will tend to show a low score.
3:30 pm on Oct 10, 2014 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member fathom is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:May 5, 2002
posts: 4110
votes: 109


Blocking Majestic will not keep your backlinks from their graph. It will only keep Majestic from knowing your outbound links. To keep Majestic from knowing your inbound links you would have to persuade every site that links to you to block Majestic.


Interesting... Funny how some websites seem to have almost no backlinks; they must be great persuaders.
4:22 pm on Oct 10, 2014 (gmt 0)

Full Member

10+ Year Member Top Contributors Of The Month

joined:Jan 3, 2004
posts:321
votes: 39


Nothing will replace PageRank. These other tools may show good numbers because of inbound links, when Google could have banned the site for those very same inbound links... think about it.
4:31 pm on Oct 10, 2014 (gmt 0)

Moderator from US 

WebmasterWorld Administrator martinibuster is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 13, 2002
posts:14923
votes: 492


Funny how some websites seem to have almost no backlinks...


That's because third party crawler data is incomplete. Never assume you're looking at the complete set of backlinks. It isn't. Eric Enge estimated last year that the data was about 30-50% of the actual total.
10:54 pm on Oct 10, 2014 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member keyplyr is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Sept 26, 2001
posts:12913
votes: 893




...blocking third party crawlers. But there are no upsides, either.

Wrong. The upside is they don't get to scrape my content/analytics and gain from it.
2:28 am on Oct 11, 2014 (gmt 0)

Junior Member

5+ Year Member

joined:May 16, 2014
posts:141
votes: 0


While the least impressive use of Majestic is to synthesize a quick snapshot of the PR "worth" of a site, I find I use it quite often when evaluating sites I own. Applying different test conditions to those sites lets me see, in close to real time, changes to the Majestic metrics.

Kelowna also points out that it isn't PR and I agree, but it does provide that window to the linkgraph I sometimes like to see.

Overall, I believe Trust Flow is significant, but it is only one part of the personalized/localized/intent-determined/relevance-rated/viewport-ranked/A-B-tested search results.
2:36 am on Oct 11, 2014 (gmt 0)

Moderator from US 

WebmasterWorld Administrator martinibuster is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 13, 2002
posts:14923
votes: 492


I would be much obliged if you explained how you gain insight into your analytics. Not sure what you mean, but I would like to know.

Majestic has a page about the benefits of letting them crawl you [blog.majestic.com], but it didn't convince me. Am I missing something? I would love for Dixon to explain how it would benefit me, you, us to allow them to crawl us.
3:06 am on Oct 11, 2014 (gmt 0)

Junior Member

5+ Year Member

joined:May 16, 2014
posts:141
votes: 0


Test sites. Apply certain conditions (different inbound link types) and watch what happens between the Majestic metrics and the SERPs. If it looks promising, apply similar conditions to a "better" site for further evaluation.

I wasn't promoting letting Majestic crawl; I just don't know of any real benefit in blocking them. A personal choice, IMO.
7:06 pm on Oct 12, 2014 (gmt 0)

Moderator This Forum from GB 

WebmasterWorld Administrator dixonjones is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Mar 8, 2002
posts: 2945
votes: 25


From Roger:

I would love for Dixon to explain how it would benefit me, you, us to allow them to crawl us.


Some of you will know that Majestic has a search engine in Alpha (majestic.com/reports/search-explorer). Unlike other search engines, Majestic currently succeeds WITHOUT taking and storing all the on-page content.

Webmasters are increasingly frustrated that their business models are being eroded by the major search engines, which seek to exploit the knowledge extracted from their websites to expand their own reach and ultimately deliver "value for their users" by never having to send those users to your websites AT ALL.

Majestic's model will offer a real alternative. Majestic's search engine may not be perfect yet, but it has an API. That means that when it is out of Alpha, we will be able to distribute the API commercially. It can be used by other businesses that would rather be independent of the two main search engines.

The comments have gone somewhat off the point of the post, so let me bring it back by saying that the development of Trust Flow at Majestic has been a vital element in developing what we have so far. If a page's Trust Flow is good... that doesn't make it "Good in Google's eyes"... it makes it "Good in Majestic's eyes". The correlation studies simply show that it's not far off the mark.
8:47 pm on Oct 12, 2014 (gmt 0)

Moderator from US 

WebmasterWorld Administrator martinibuster is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 13, 2002
posts:14923
votes: 492


I have reservations about the use of Trust as an SEO metric. My feeling is that Trust as a ranking metric or as a quality metric has been discussed more by SEOs than by search engineers. Any time I have read or heard search engineers discuss trust it was within a different context, as in, trustworthy sites, sites we can trust, trusted sites (which can mean that a site does not exhibit qualities of spam).

There is more. I will elaborate when I have more time. ;)
2:55 pm on Oct 13, 2014 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member ogletree is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 14, 2003
posts: 4319
votes: 42


I have done a lot of backlink analysis over the years, and sites often seem to rank in a different order than their backlinks would suggest. I have seen many times a tiny site with almost no links beat out others.

It almost seems the best off-site SEO is no off-site SEO. Build a good site, add real content written for your visitors, and eventually you will be permanently in the top 5. Expect that to take around two years. This would have to be a real company that exists outside of organic results.

I'm sure this does not work in all sectors, or for online-only companies - though it could if you spent a lot on PPC to build up the company.
3:06 pm on Oct 13, 2014 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member ogletree is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 14, 2003
posts: 4319
votes: 42


Martini, search engines may not have a "trust" factor, but the factors they do look at end up acting as one.

By default, having a top-ten list means you are saying you trust that these ten sites are the most relevant - especially when they leave out a website that beats their algo except for having been deemed untrustworthy. They have decided that some website will not be included, even though everything else points to the fact that it should be, because they have decided it has done something that makes it untrustworthy. And because of the way they do this, they effectively assign a value to how much they don't trust somebody.
3:31 pm on Oct 13, 2014 (gmt 0)

Moderator from US 

WebmasterWorld Administrator martinibuster is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 13, 2002
posts:14923
votes: 492


...search engines may not have a "trust" factor but the factors they do look at end up being a trust factor.


Thanks, that is largely what I was trying to communicate, only you did it more clearly and directly. :)
2:39 pm on Oct 14, 2014 (gmt 0)

New User

5+ Year Member

joined:Jan 5, 2011
posts:9
votes: 0


Does anyone have any ideas about how to check whether a site has been penalized without the PR tool? The only thing I was using PR for anyway was to check whether a site had its PageRank reset to 0.
2:44 pm on Oct 14, 2014 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member ogletree is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 14, 2003
posts: 4319
votes: 42


@theblackout The first thing I always do is search for the title of the site with quotes in Google or a sentence on the front page in quotes. If you don't show up for that there is a very good chance that your site has a penalty.
2:54 pm on Oct 14, 2014 (gmt 0)

New User

5+ Year Member

joined:Jan 5, 2011
posts:9
votes: 0


@ogletree Good solution, thanks.
3:27 pm on Oct 14, 2014 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member ogletree is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 14, 2003
posts: 4319
votes: 42


The next step is to look at backlinks using Moz and/or Majestic. There are also some other websites out there that have penalty checkers which I'm sure you can find in Google. Of course the best way to tell is to look at analytics. I sometimes compare link charts in Majestic with traffic data to see when and what the problem might be.
4:13 pm on Oct 14, 2014 (gmt 0)

New User

5+ Year Member

joined:Jan 5, 2011
posts:9
votes: 0


Yes, I do tend to look at some of these factors as well. I couldn't find out whether Moz DA/PA is updated if a site has been penalized - do you know if this is the case? When analysing a large set of links I was pasting them into an Excel spreadsheet and grabbing the PageRank score with the SEOtools add-in, to see the status of each link and whether it had gone to 0. I just wonder if I can do the same with DA/PA.
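The spreadsheet workflow described here - flagging URLs whose score has dropped to zero between two snapshots - is straightforward to reproduce in a few lines of Python once the scores are exported. The CSV data and column names below are invented for illustration; the real data would come from your own spreadsheet or an API pull:

```python
import csv
from io import StringIO

# Invented export: one row per URL, with the metric captured before and
# after the update being checked.
export = """url,score_before,score_after
example.com/a,4,4
example.com/b,3,0
example.com/c,5,0
"""

def flag_zeroed(rows):
    """Return URLs whose score was positive before and is now zero."""
    return [row["url"] for row in rows
            if int(row["score_before"]) > 0 and int(row["score_after"]) == 0]

rows = list(csv.DictReader(StringIO(export)))
zeroed = flag_zeroed(rows)
print(zeroed)  # the two URLs that dropped to 0
```

In practice you would read the file with `open()` rather than `StringIO`; the in-memory string just keeps the sketch self-contained.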
5:36 pm on Oct 14, 2014 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member ogletree is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 14, 2003
posts: 4319
votes: 42


I don't think you can. All I was talking about was a quick look at links. If you see a lot of free directories and some odd link structure that might give you a clue.
9:04 pm on Oct 14, 2014 (gmt 0)

Administrator

WebmasterWorld Administrator phranque is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Aug 10, 2004
posts:11823
votes: 236


The first thing I always do is search for the title of the site with quotes in Google or a sentence on the front page in quotes. If you don't show up for that there is a very good chance that your site has a penalty.

This test tells you whether the home page is indexed, but it doesn't necessarily indicate ranking penalties.
9:17 pm on Oct 14, 2014 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member ogletree is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 14, 2003
posts: 4319
votes: 42


@phranque, that is only the case if what you type in brings up your site alone. Many times you would pick a long phrase that is common to many sites. Also, many sites will have scraped your content. If you can't beat your scraped content, there is something wrong.