|PageRank vs Rank Monitoring, How to Monitor Website Health?|
Is there really a way to get a reliable, quantified value of Page Rank for a site?
I was thinking that Google won't tell you, and common sense tells me that PR varies for every combination of keywords. But I keep seeing people refer to, e.g., a PR of 6.
I've been assuming that the only way to monitor PR is to have a list of keywords that you use to search and then just count the position of the site in the results.
My main site is problematic in that it contains over a million variations of one page and the keywords used by visitors are almost unique to each of those pages. (It takes GoogleBot 3 months of non-stop crawling to reindex my site.)
So maintaining a list of keywords for PR monitoring is dicey. They are grouped under a category of about 35,000 units, so I can search a representative handful of those categories, but those would be secondary search terms in the real world of my users.
I can't use the "most common keywords used" in reports, because those are just very unusual keywords that one individual keeps searching on day after day.
Welcome to WebmasterWorld and to SEO!
Here is a quick rundown of PageRank.
-Only Google knows the real PageRank. Every few months they release a snapshot of this data, and it is displayed in the browser toolbars. When people say they have a PR6 site, they are referring to Toolbar PageRank (TBPR).
-Google occasionally will manipulate TBPR of certain pages in an effort to discourage certain people from abusing the system.
-PageRank used to be the key to ranking, now its value has greatly diminished as other factors have grown in the algorithm. People still debate just how useful PageRank really is in today's algorithm.
-PageRank is totally different from rank monitoring. Rank monitoring looks at what position your URL holds in the search results for a specific search term. It is hard to do accurate rank monitoring, since Google personalizes the search results and universal SERPs can trigger maps, images, videos, and other elements to appear above the organic search listings.
-PageRank is specific to each URL, so it is not possible to have a PR6 site. It is possible to have a PR6 homepage, but each page on your site will have its own PageRank. Ranking is specific to each URL and each search term; it is possible to have multiple URLs ranking in different positions for the same search term.
To monitor the health of a website, I would start with search referrals. It does not matter if you rank #1 for a search term, if that search term brings in no traffic. I would want to see steady growth in Google traffic and conversions from that traffic. When traffic goes down, I would then look in analytics to try to identify specific search terms that are underperforming, then I would look at rankings.
In general I do care about my PageRank. This is not because I think PageRank directly helps my rankings. I do this because the websites linking to me (and potential link partners) pay attention to PageRank. If my PageRank drops it will be much harder for me to negotiate links and cross-promotion opportunities with other websites.
I think PageRank indicates the 'potential' power of a site but it doesn't reflect the overall health of the site at all.
I have a site which was PR3 when Panda hit and is now PR4 but traffic is worse than ever and traffic levels are nothing compared to other PR4 sites in the niche that are not affected by Panda.
I'm not sure what to look at to determine a site's health these days. Good rankings/traffic can disappear literally overnight.
I wouldn't consider any site that is heavily dependent on Google as "healthy".
Pre Panda, I used to run a specialist site which had its two main pages at PR3. For its niche search terms it ranked better than more generalist PR4 and PR5 sites in the same sector. In the end I got bored with checking PR and concentrated on the actual traffic.
|To monitor the health of a website, I would start with search referrals. It does not matter if you rank #1 for a search term, if that search term brings in no traffic. I would want to see steady growth in Google traffic and conversions from that traffic. When traffic goes down, I would then look in analytics to try to identify specific search terms that are underperforming, then I would look at rankings. |
+1 @ GoodROI
It all comes down to specific business goals. After all, the client is expected to invest both in SEO and in improving the website (content, coding, multimedia, etc.).
We need to educate clients that SEO is measured by specific business goals: sales, leads if there are no direct sales, newsletter sign-ups, etc. Metrics such as Google Toolbar PageRank can only be an indicator, and one that makes no sense to many people, as they won't even know that it is an exponential metric and not a linear one (please correct me if exponential is not the right word).
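To illustrate the point above: the commonly assumed model is that Toolbar PageRank is a logarithmic scale, so each step up represents several times more underlying PageRank, not one more "unit" of it. Nobody outside Google knows the real scale; the base below is purely a hypothetical choice for illustration.

```python
import math

# Hypothetical base: assume each Toolbar PR step covers roughly an
# 8x increase in underlying PageRank. This number is an assumption,
# not a documented Google value.
BASE = 8

def toolbar_pr(raw_score):
    """Map a hypothetical raw PageRank score (>= 1) to a 0-10 toolbar value."""
    return min(10, int(math.log(raw_score, BASE)))

# Under this model, going from TBPR 1 to TBPR 3 takes ~100x more
# raw PageRank, which is why the metric confuses people who read it
# as a linear 0-10 grade.
for raw in (1, 10, 100, 1000):
    print(raw, "->", "TBPR", toolbar_pr(raw))
```

The takeaway for clients: the jump from PR3 to PR4 is far bigger than the jump from PR1 to PR2, so comparing sites by toolbar digits alone is misleading.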
People do get confused heavily, especially those in corporate marketing departments ;-)
Great, useful answer and follow up posts! Thanks!
|My main site is problematic in that it contains over a million variations of one page and the keywords used by visitors are almost unique to each of those pages. (It takes GoogleBot 3 months of non-stop crawling to reindex my site.) |
This sounds like it could be problematic from a search engine standpoint, but maybe I'm not understanding it properly. Are your rankings / traffic levels okay right now?
I agree that PR's value has diminished a lot, but it can still be a good metric to follow if used appropriately. Even if you have millions of pages, you will have major hub pages that concentrate most of your site's value. I would check whether all the pages one or two clicks away from your homepage have PageRank. Low-value websites are not websites with a low homepage PR; they are sites with a lot of greyed-out (N/A) PR, which Google doesn't even bother to evaluate. Make sure your category pages are at least PR1 or PR2, and keep track of pages showing N/A or even PR0.
As for rank tracking, I think even low-volume keywords can be helpful for seeing whether your site's search visibility has been reduced. I normally keep track of thousands of keywords, so if I see a drop across many small keywords, I can act before the more important ones drop too.
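The approach above, watching thousands of keywords for early drops, can be sketched as a simple diff between two snapshots of tracked positions. The keyword data and the drop threshold here are hypothetical; in practice you would feed in exports from whatever rank-tracking tool you use.

```python
# A toy sketch of flagging keywords whose ranking worsened between
# two snapshots. All keywords and positions below are made up.

def find_drops(previous, current, threshold=5):
    """Return (keyword, old_pos, new_pos) for keywords whose 1-based
    position worsened by more than `threshold` places."""
    drops = []
    for kw, old_pos in previous.items():
        # Treat a keyword that no longer ranks as position 101 (beyond page 10).
        new_pos = current.get(kw, 101)
        if new_pos - old_pos > threshold:
            drops.append((kw, old_pos, new_pos))
    # Biggest drops first, so the report surfaces the worst movers.
    return sorted(drops, key=lambda d: d[2] - d[1], reverse=True)

last_week = {"blue widgets": 4, "cheap widgets": 12, "widget repair": 8}
this_week = {"blue widgets": 5, "cheap widgets": 31, "widget repair": 9}
print(find_drops(last_week, this_week))
```

A run over last week's and this week's snapshots flags only "cheap widgets" (12 → 31); the one-place moves stay under the threshold. Seeing many such flags at once across low-volume terms is the early-warning signal the post describes.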
I don't believe the OP has any understanding of PageRank, which is page-based and not (necessarily) site-based.
PageRank, named after Larry Page, is essentially a mathematical formula that assigns a positive value to each page based on the link graph of the web.
This might be useful: Google PR - PageRank FAQs [webmasterworld.com].
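For a concrete sense of the formula mentioned above, here is a minimal power-iteration sketch of PageRank on a toy three-page link graph. The graph, the 0.85 damping factor, and the iteration count are standard textbook choices for illustration, not Google's actual parameters.

```python
# Minimal PageRank sketch: each page's rank is the damped sum of
# rank shares flowing in from the pages that link to it.

DAMPING = 0.85  # conventional textbook value

def pagerank(links, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with a uniform distribution
    for _ in range(iterations):
        new = {p: (1 - DAMPING) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                share = DAMPING * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
            else:
                # Dangling page with no outlinks: spread its rank evenly.
                for target in pages:
                    new[target] += DAMPING * rank[page] / n
        rank = new
    return rank

graph = {"home": ["about", "blog"], "about": ["home"], "blog": ["home", "about"]}
ranks = pagerank(graph)
print(ranks)
```

Note how this matches the point made earlier in the thread: the value is computed per URL, not per site, and a page's score depends entirely on who links to it.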
No, my pages are usually pretty far down the list. I just ran three samples; results came back on pages 3, 4, and 7. These may be skewed, as I got them from looking at logs of recent visitor activity. Sometimes a page is so far back that I stop looking and use a "site:" search to verify it has been indexed. A lot of my images are showing up on the first page of image searches, however.
My keywords are proper nouns, used in searches for several general-interest purposes, and there are several established heavy hitters in the area that I serve. Developing external links to my site doesn't seem like an easy goal, since my competition has a huge head start there, but my user base is growing every month and I'm seeing increased references to my site on Facebook and Wikipedia from those users.
To tell the truth, since I'm not quite ready for prime time in several ways, I don't really want to be on the first page. Page two would be nice. My visitors however, often find me because while a few major sites may have the data they are looking for, I'm one of the few (esp. on this scale) that provides it free and conveniently.
As for traffic levels, they've fallen from about 12k/day a year and a half ago to about 6-7k/day. I expect those to increase in the next few months, as I'm doubling my content. Before last week's drop in CTR to a tenth of previous levels, the traffic was good enough that if my wife lost her job (likely), we could have absorbed the hit. Now we can't.
I think it will be easier for me to increase traffic: first by adding content (almost done), and then by repackaging the content for a different market (which is already crowded, but I'll offer added value with an online tool not offered by anyone else). I was thinking about putting that new website on its own domain, but I don't want to get into trouble with millions of crosslinks between sister sites, so it may have to share a domain with Site One, even though their markets are (ostensibly) different; many people come to Site One with Site Two purposes. (But I posted this on a different thread already...)
You might want to see if your drop in traffic matches up with any of the Panda updates, before they were rolled into the regular updates.
I'm still not understanding your site architecture, but the way you describe it, as the same page with different parameters based on the search term, makes me wonder whether it isn't being mistaken for a search results page, which Google has said specifically they don't want in the index, and which could be a quality signal of some kind.
Appreciate the time you are taking. Let me see if I can find an analogy...
Let's say my site was a real estate site (which it isn't). It might have a record for every residential address, and for each record you could look up ownership info, market value, maybe some links to that address on other useful sites, etc. So you could search your childhood address, "123 Main St, Anytown, USA", and get a mix of search results similar to what my site produces. Each town has a page listing street names, and each street page has a list of addresses.
My traffic drops were more gradual than what I would expect from G updates. My best guess is that part of the pool of searchers became somewhat satiated. That is, some of them search for the same keywords routinely and see what new (to them) sites show up. Or more likely, G dropped me a little in the results over time. Or both.
[edited by: Robert_Charlton at 6:43 am (utc) on Jul 12, 2013]
[edit reason] fixed spelling [/edit]
Welcome to WebmasterWorld, burchy!
|My main site is problematic in that it contains over a million variations of one page and the keywords used by visitors are almost unique to each of those pages. |
I'm guessing you have a template or set of templates that you are filling with fields from your database to generate these pages.
If a large percentage of the content of each page doesn't change, then much of your content will be filtered, so while it may be indexed, it won't rank, because it looks like duplicate content.
You have to ask yourself what value each page offers the visitor before you can expect it to rank.
When you have a large number of pages, you need a good information architecture and a lot of link equity to get crawled and indexed well.
This thread would be informative for your situation:
How do huge sites get such complete index coverage?
I use PHP/MySQL to build the pages. The content that changes is typically about 90% of the page. I have no complaints about getting indexed; Google's been pretty thorough, and Bing is catching up (since I started using media.net ads). I've used sitemaps with Google for a couple of years. I don't have many complaints about rank either; most of the sites above me are established and well known, although I don't feel about half of them have much to do with the keywords searched. I'll earn a higher rank in time.