Forum Moderators: mack
On one of my new sites my hosting company provides stats using DeepMetrix LiveStats XSP 6.2
I understand most of it - I have two questions though.
1. Is there a known problem with how DeepMetrix reports repeat visitors?
The Site Activity > Repeat Visitors report has a column that says Repeat Visitors/Total Visitors. But the figures there tend to be like so:
675/678
820/821
768/780
Most of my traffic comes from search engines, and I know I get very few repeat visitors - the topic of the site is not something you come back to check every day. (Google sends me a lot of visitors, and I get some from a few specialist directories.) Why do the stats show over 99% repeat visitors?
2. Are Googlebot's visits counted in the # of visitors? Googlebot shows up every day and makes about 30,000 hits per month (I'm not complaining). Total hits for the month tend to be about 180,000. (I understand that these are not page impressions). Does this mean that a sixth of my traffic is just Googlebot?
A regular visitor will make "hits" (requests) for your html files, image files, css files, and js files each time a page is viewed, so for an ordinary visitor, a pageview is more than one hit. Googlebot will only request the html part, so for Googlebot, a pageview is equal to a hit. (All this in the general case.)
So the proportion of pageviews that are caused by Googlebot will be higher than 1/6 of the total.
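A quick back-of-envelope calculation makes this concrete. The 5 hits-per-pageview figure for human visitors is purely an assumption for illustration; the real number depends on how many images, css, and js files your pages pull in:

```python
# Rough estimate of Googlebot's share of pageviews from the numbers above.
# Assumption: an average human pageview generates ~5 hits (1 html file
# plus images/css/js); for Googlebot, 1 hit = 1 pageview.

total_hits = 180_000         # monthly total from the stats
googlebot_hits = 30_000      # Googlebot's hits for the month
hits_per_human_pageview = 5  # assumed average - varies per site

googlebot_pageviews = googlebot_hits  # one hit per pageview for the bot
human_pageviews = (total_hits - googlebot_hits) / hits_per_human_pageview

share = googlebot_pageviews / (googlebot_pageviews + human_pageviews)
print(f"Googlebot share of pageviews: {share:.0%}")  # 50% under these assumptions
```

So while Googlebot is only 1/6 of the hits, under these assumptions it could be around half of the pageviews.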
Pageviews, again, do not equal visitors. Some visitors will look at three pages, some at one, and Gbot will probably look at a whole lot of them, skewing the numbers. So even though it accounts for more than 1/6 of your pageviews, it will probably account for less than 1/6 of your visitors.
I don't know how this particular stats package identifies repeat visitors. If it uses cookies, then Googlebot is not counted, as Googlebot (and other spiders) don't accept cookies.
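If you want to check for yourself how much of the traffic is Googlebot, you can count hits in the raw access log by user-agent. This is only a sketch assuming an Apache "combined" log format, where the user-agent is the last quoted field; matching on "Googlebot" in the string is a common heuristic, not a strict verification:

```python
# Sketch: count Googlebot vs. other hits in an access log.
# Assumes "combined" log format with the user-agent as the last quoted
# field on each line; adjust for your server's actual log format.

from collections import Counter

def classify(line: str) -> str:
    """Crude heuristic: treat any hit whose line mentions Googlebot as the bot."""
    return "googlebot" if "Googlebot" in line else "other"

def count_hits(lines):
    return Counter(classify(line) for line in lines)

# Hypothetical sample lines for illustration:
sample = [
    '66.249.66.1 - - [...] "GET /page.html HTTP/1.0" 200 512 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '10.0.0.5 - - [...] "GET /page.html HTTP/1.1" 200 512 "-" "Mozilla/4.0"',
    '10.0.0.5 - - [...] "GET /style.css HTTP/1.1" 200 128 "-" "Mozilla/4.0"',
]
print(count_hits(sample))  # Counter({'other': 2, 'googlebot': 1})
```

Note that the human visitor above produced two hits (html + css) for one pageview, while Googlebot produced one hit for one pageview - exactly the skew described earlier.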
/claus
I hadn't quite thought about it, but it does make sense that the bots don't accept cookies. The stats still look strange though. But I get indications of weird results on two other websites using different stats packages (one uses Hitbox). I suppose it's a matter of learning which parts of the stats to trust as fairly reliable. It seems that most stats packages have some big anomalies.
I'm disappointed to learn that such a large proportion of what I thought were human visitors turns out to be Googlebot. Instead of the few hundred visitors I thought I was getting per day, it seems it may be only one or two humans :-(
Of course Googlebot is always welcome and can take pretty well as much bandwidth as it wants but it's the human visitors for whom the site was made.
You might want to consider using cookie-based stats for visitor tracking. In post #9 of this thread i have compiled a list of links to good threads that discuss log based stats vs. cookie based stats:
Difference between a log analyzer and a stats software [webmasterworld.com]
Personally, i only use logfiles to track spiders and bots. I don't use them for tracking real visitors.
/claus