Forum Moderators: DixonJones

Message Too Old, No Replies

Poll: What web stats service do you use?

I'm curious to see what the webmasters here use...


PFOnline

6:39 am on Feb 2, 2003 (gmt 0)

10+ Year Member



Currently using NedStat Basic and Webalizer here.

sarahk

11:42 am on May 21, 2003 (gmt 0)

10+ Year Member



Webanalyse [webanalyse.sourceforge.net...]
A PHP tool, free, on your server.

I did use Stats4all, but then they decided to charge (no problem there), except they would only invoice to a Dutch address. Go figure! Then I couldn't get an export of my stats!

Plus it didn't report on SE hits, so I'm writing something for that: [sourceforge.net...]

kgoeres

6:33 pm on May 21, 2003 (gmt 0)

10+ Year Member



Hitbox Pro

Used to use WebTrends Log Analyzer, but decided to start tracking user behavior instead of server activity.

kellimsf

6:36 pm on May 21, 2003 (gmt 0)

10+ Year Member



Using HitsLink, WebSideStory and WebTrends

windybanks

5:49 pm on May 23, 2003 (gmt 0)

10+ Year Member



WebTrends, but we are transitioning to a new one, Visitour. Looks good so far, although the numbers are somewhat different from WebTrends!

Anon_Blond

12:01 am on May 24, 2003 (gmt 0)

10+ Year Member



ClickTracks

NameNick

3:54 pm on May 24, 2003 (gmt 0)

10+ Year Member



After using Webalizer for quite a long period, I ended up with WebLog Expert: a tool that is simple to use and comes with nice and, foremost, useful filter and tracking features.

I'm also using a self-made PHP script to log all referrers, just to stay informed about which sites are using our products ;)
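A referrer logger along those lines might look roughly like this; a minimal sketch in Python rather than PHP (the referers.log file name and timestamp format are my own assumptions, not the poster's actual script):

```python
import os
from datetime import datetime, timezone

def log_referer(logfile="referers.log"):
    """Append the referring URL of the current CGI request, if any."""
    # The web server sets HTTP_REFERER in the CGI environment per request.
    ref = os.environ.get("HTTP_REFERER")
    if ref:  # only log requests that actually carry a referrer
        stamp = datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M:%S")
        with open(logfile, "a") as fh:
            fh.write(f"{stamp}\t{ref}\n")
```

Sorting and counting the resulting file then shows at a glance which sites send traffic.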

NN

sublime1

3:01 am on Jun 11, 2003 (gmt 0)

10+ Year Member



NetTracker. Simple, reliable, and all of the data is there. We live and die by it. I looked at WebTrends -- too pricey.

mnorton

7:48 am on Jun 11, 2003 (gmt 0)

10+ Year Member



I use NetTracker, as it does everything you could ever really need to look at, although it could be a bit faster.

eaden

10:06 pm on Jun 13, 2003 (gmt 0)

10+ Year Member



Webalizer.

But this:
[pathalizer.bzzt.net...]
is uber-neat too, and I use it. It draws a pretty graph with lines showing where people go within your site, but it's not a complete solution, so you would use it in addition to another stats program.

hotice_2002

2:45 am on Jun 14, 2003 (gmt 0)

10+ Year Member



webalizer and userfinger

palmpal

3:58 pm on Jun 14, 2003 (gmt 0)

10+ Year Member



Faststats Analyzer

keyplyr

4:46 pm on Jun 14, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Analog sans Report Magic

chewy

10:40 pm on Jun 14, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Anybody using GoToast?

I use WebTrends Live (and really like it - you can have multiple sites there for the $35/month, but the page views can cost if you have high traffic...)

I also use Log Analyzer 7.0 as the 'old standby' because I can see the actual log files.

I tried Web Trends 8 and like it, but haven't got the heavy iron yet to run it like I want to.

I also used FunnelWeb and really liked it, but couldn't reason with the vendor to come up with a reasonable price.

Analog I couldn't quite figure out (yet)...

Web CEO didn't seem quite stable yet when I used it a while ago - although it looks quite promising.

I like Urchin as it is so simple and seems to be reasonably close to WebTrends.

I do a lot of looking at raw logs and wish there was something better than the tool in Web Trends 7.

Chewy

prophecy

5:01 pm on Jun 15, 2003 (gmt 0)

10+ Year Member



Ecommstats here

berli

11:42 pm on Jun 16, 2003 (gmt 0)

10+ Year Member



My hosting provider gives me cPanel 6, which has AWStats and Webalizer. I use them both, since they log URLs differently. (AWStats has the annoying habit of treating ALL Google and Yahoo URLs as if they came from the search engine -- even groups.google and groups.yahoo referrals.)

cPanel 6 also has Analog, but it doesn't work right. It seems to read the same log file in over and over, so the first month now shows thousands and thousands of hits when there were only several dozen.

I also like to check through my raw logs. They can get unwieldy, so I end up using Apple+F a lot :)

Dan_C

8:17 pm on Jun 17, 2003 (gmt 0)

10+ Year Member



I'm relatively new to all this, so I only started looking at all these tracker/stats packages a few weeks back.
To date I've tested several. Urchin was the most disappointing, especially considering the cost. I'm currently on AWStats, which a few people here seem to recommend/use; a bonus was that it is free.

Looking at all the stats produced by the available packages, I've come to the conclusion that there are only a few pieces of information I am actually interested in:

1) Search engine used
2) keywords used on the search engine
3) which page they entered on

All the rest that is offered, like which operating system the visitor has, the time of the last visit, etc., is irrelevant to me personally.

prophecy

3:23 pm on Jun 18, 2003 (gmt 0)

10+ Year Member



Those are very important, yes, but even more important: which search words did people buy from? How much did they spend? What's the ROI on my ad campaign?

golly_molly

4:25 pm on Jun 18, 2003 (gmt 0)

10+ Year Member



I just got done with a ClickTracks demo and it does all those things.

prophecy

4:58 pm on Jun 18, 2003 (gmt 0)

10+ Year Member



Actually it doesn't do all those things. I don't think it has any concept of money.

I think Ecommstats has the most features for ecommerce sites and selling things on the net.

golly_molly

5:03 pm on Jun 18, 2003 (gmt 0)

10+ Year Member



Hmmm, you're probably right. My site isn't transactional (yet) so I didn't ask that question.

Molly

Dan_C

9:37 am on Jun 21, 2003 (gmt 0)

10+ Year Member



The AWStats software I'm now testing also does not appear to do this.

It will give you a list of search engines used and also a list of keywords used, but it won't put the two together and say which keyword was searched on which search engine.

It would be useful to know which search terms were working on which search engine.

prophecy

5:05 pm on Jun 21, 2003 (gmt 0)

10+ Year Member



Try out Ecommstats; they have a really good free trial (50,000 pageviews), which lasted me several months.

claus

3:15 am on Jun 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Log analysis beyond the most basic level unfortunately involves a fair amount of guesswork. The danger of leaving too much guessing to the software is that it's very hard to know whether the results can be trusted.


The basic problem with server logs is that they are, unsurprisingly, server logs. What they do is to record the activity of the server. No more and no less.

This gives rise to a multitude of spin-off problems, such as "very large logs that cannot be parsed", "logs from multiple (e.g. load-balanced) servers not being comparable", and certainly this one: "an IP can mean more than one person (company), exactly one person (DSL, own IP), or less than one person (DSL, dynamic IP)".

Most striking is the use of the term "hits". Frankly speaking, I am shocked to see the term used when discussing web site traffic :o

A client request actually "hits" the server. It's the right term to use in this exact case. But nothing whatsoever guarantees that (as a result of this hit) a real page will actually "hit" a browser (and be seen by a person).

So, what we have is this situation (extremely simplified, I admit)

  • a range of IPs not corresponding to humans,
  • asking the server for pages, as well as a lot of other stuff (pix, css, js, cgi...)
  • getting what was asked for thrown back at them (if found, if allowed, if..)
  • possibly losing parts in central caches
  • losing other parts in company caches and firewalls
  • losing yet other parts in browser caches
  • and users at the other end, quite possibly seeing something the server will never be able to record, e.g. dynamic HTML/JavaScript

So, for incoming traffic (e.g. SEs), logs are a nice tool to find out "what drives traffic to my site", but as soon as the person behind the browser is on the site, navigating, logfile usefulness stops.
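That hits-versus-pages distinction is easy to see in a small parser. A minimal, illustrative Python sketch, assuming the common Apache "combined" log format (the asset-extension list is an arbitrary assumption; real analyzers are far more careful):

```python
import re
from collections import Counter

# Regex for the Apache/NCSA "combined" log format (a common convention;
# other servers may use a different field order).
LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ '
    r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

# Requests for these extensions count as asset hits, not page views.
ASSET_EXTS = (".gif", ".jpg", ".png", ".css", ".js", ".ico")

def summarize(lines):
    """Split raw server hits into page views vs. asset hits, tally referrers."""
    hits = pages = 0
    referers = Counter()
    for line in lines:
        m = LINE.match(line)
        if not m:
            continue  # malformed line
        hits += 1
        path = m.group("path").split("?")[0].lower()
        if not path.endswith(ASSET_EXTS):
            pages += 1
            ref = m.group("referer")
            if ref and ref != "-":
                referers[ref] += 1  # what drove this visit in
    return hits, pages, referers
```

Every counted "hit" here is still only a server event; nothing in the log says whether the page was cached, aborted, or actually seen.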

The point I am trying to make is simply that what goes on inside the server is something completely different from what goes on in a browser used by a person.

kgoeres put it in a very nice way above:

Used to use WebTrends Log Analyzer, but decided to start tracking user behavior instead of server activity

Because of this, I personally prefer cookie-based measurement methods, especially combined with IMG CGIs to catch JS-disabled browsers.
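Such an IMG fallback is typically just a CGI that logs the request and returns a 1x1 transparent GIF, so an `<img>` tag inside `<noscript>` still counts visitors whose browsers refuse JavaScript. A minimal Python sketch (the pixel.log name and the logged fields are assumptions, not any particular vendor's implementation):

```python
import base64
import os
import sys
from datetime import datetime, timezone

# A 1x1 transparent GIF (42 bytes), base64-encoded.
PIXEL = base64.b64decode(
    "R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7"
)

def log_hit(logfile="pixel.log"):
    """Append one tab-separated line per pixel request: time, IP, referrer, UA."""
    line = "\t".join([
        datetime.now(timezone.utc).isoformat(),
        os.environ.get("REMOTE_ADDR", "-"),
        os.environ.get("HTTP_REFERER", "-"),
        os.environ.get("HTTP_USER_AGENT", "-"),
    ])
    with open(logfile, "a") as fh:
        fh.write(line + "\n")

def serve_pixel(out=None):
    """Emit CGI headers plus the GIF bytes, so the browser gets a valid image."""
    out = out or sys.stdout.buffer
    out.write(b"Content-Type: image/gif\r\n")
    out.write(b"Content-Length: %d\r\n\r\n" % len(PIXEL))
    out.write(PIXEL)
```

Because the image request is made by the browser as the page renders, it measures something closer to a page view than a raw server hit.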

Being from Europe, I doubt that we have an excess of the kind of ultra-paranoid (corporate?) cookie control I saw referred to by Receptional:

The real problem that I see is that cookie based systems seem to be getting less reliable as firewalls start blocking cookies

Yet we still have user-agents with disabled cookies. And we still have user-agents with disabled JS. We also have browser-sharing in families, net-cafés, and public libraries, so that one cannot always equal a browser to a person.

Still, in my humble opinion, browsers are much closer to end users than web servers are. No pun intended.

/claus

finish_last

12:23 pm on Jun 25, 2003 (gmt 0)

10+ Year Member



Claus,

HI, yes, server logs suck.

I have been using Opentracker.net for a few weeks now.
They are browser-based, and from what I understand of what you are saying, they focus on the human, not the IP address. My understanding is that they track page views (which are human events?) versus hits. They create cookies for individual visitors and track them over the long term. I haven't really decided how I feel about this; I think that as long as I know it's happening, it doesn't bother me (as much). I can always delete my cookies.

So they track individual human visitors, and like Dan_C says:

1) Search engine used
2) keywords used on search engine
3) which page they entered on

All of which I see on Opentracker.net

I am not really interested in the stats as much, though, as in the click-streams. That's the interesting part for me, in terms of being able to check out what my visitors are doing. I can actually go in and out of people's click-streams while they are online.

This is human behaviour!

super_seo

6:56 pm on Jun 26, 2003 (gmt 0)

10+ Year Member



I've been spoiled by my past host, who had my dedicated server set up with LiveStats; I am now moving to a new server that doesn't have this feature. I am located in Asia, where bandwidth is, let's say, non-existent, so I can't host my own server. We have a fantastic hosting provider in the States, although quite expensive; our hosting costs in excess of $10,000 a month. Many of our sites' log files are upwards of 100 megabytes a day, and a few are approaching the 1 GB mark. There is no way from where we are located to download these log files, so we need an analyzer that runs off our hosted web server and produces live stats online. I'm wondering if there is anyone here who could suggest an option. We'd like to track 100-200 domains on the server, with log files averaging 100 megabytes per site daily. Our tracking of most of the sites would be quite basic; only for one or two of our larger sites would we like more detailed analysis.

Really appreciate any help or suggestion,
cheers

claus

8:44 pm on Jun 26, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



finish_last:

Generally speaking (service type, not a specific company), one of the advantages of such services is that you really do not need to do a lot of irrelevant work yourself: plug in the code, read the stats in a browser, and that's it.

After all, it's the stats that are interesting, not the pushing and parsing of some number of mega- or gigabytes. Let alone the technical configuration of a log-analysis tool, which... well... (lack of words)

A few words of caution, though (and this is not specifically directed against any firm):

You should always make sure that the firm you choose can document how the stats are generated.

A service like this is typically a hosted one. As such, you cannot control what they are doing the way you can with a log-analysis tool that you configure yourself. They will tell you something, but not everything, and that's fine; but beware of firms that tell you nada.

Also, a hosted service means that for each pageview on any site operated by any customer of theirs, their own servers are going to get hit. Not "hit" as in "server hits"; more like, say, a form containing information being sent off to them each time a page is shown on a client site.

The richer (more detailed) the stats are, the more information is constantly being sent back and forth between servers. There's some math here, but I'll do it in plain words:

It means that the larger the customer base is (either large sites, or many sites, or a combination), the larger the amount of data they will need to process at any time. If they provide very rich stats (very detailed information), processing the data will demand more raw power than a few simple stats will.

That's two levels of complexity. The third arises from how frequently they update. Do they run one or a few giant batches to crunch the data, or do they compute in "real time"? The more frequent the updates, the more power needed.

See the catch? More customers or a better product means higher costs.

One way to solve these issues is to buy a lot of machines. Another is to employ some advanced math and database handling. A third is to use sampling (actually a subset of option two). The last option is to wait for smoke to come out of the servers as customers sign up.

To wrap it up, I'd say: don't go for free options if you want rich stats.** Of course you can, but then stick to services that only provide one or two key metrics and don't update "on the fly". They tend to be more stable in the long run.

(** I know that the service you use is a paid one; this is generally speaking.)

A firm providing "rich" stats needs to get cash in order to be able to serve you reliable stats, otherwise it faces option three.

Plus: even though you're paying, and even though you might have a small site, it's the total load of all pageviews on all sites using one specific provider that counts. The firm may choose to use sampling, that is: process only every x-th request. That way, you don't get the full picture.

Having said that, it might be a good and reliable picture anyway. All sites are different, and sampling generally obscures the results of large sites less than those of small sites.

(if you have only two pageviews and omit one, you will have a sample of one page when sampling 50%; if you have 200 PW, you will have a sample of 100 when sampling 50%)
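That parenthetical arithmetic can be made concrete. A toy sketch of every-x-th-request sampling (purely illustrative; real providers sample more carefully than integer division):

```python
def sampled_estimate(pageviews, x):
    """Keep every x-th request, then scale the sample back up by x."""
    kept = pageviews // x   # requests the stats provider actually processes
    return kept * x         # what it reports as the estimated total

# The smaller the site, the larger the relative error of the estimate:
for true_pv in (3, 7, 201):
    est = sampled_estimate(true_pv, 2)
    print(true_pv, est, round(abs(true_pv - est) / true_pv, 3))
    # prints: 3 2 0.333 / 7 6 0.143 / 201 200 0.005
```

At 50% sampling a three-pageview site is off by a third, while a 201-pageview site is off by half a percent, which is exactly why sampling hurts small sites more.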

This might sound as if I have second thoughts now, but really I haven't. I still prefer this method over logfiles any time.

Regarding cookies:

I haven't really decided how I feel about this,

- do make a privacy statement. Be honest about it. Tell your audience that their visits are being monitored and do not forget to tell them what the reason is.

You might also like to point out that a cookie is not a program, and that it cannot really do anything. Try showing them the content of a cookie. You could even provide them with guidelines on how to remove cookies.

Really, this is in your own best interest. If people do not know why they are being monitored, or if they find out "by accident", they get nervous. If they are told in plain text what happens and why, they become familiar with it instead.

At the end of the day, the purpose of monitoring web traffic is either:
a) to enable you to offer better services to them, or
b) to help you make money which, in turn, enables you to offer them better services.

(somewhat idealistic view, I admit)

So, really, this is not exactly a thing to keep secret. And they will find out anyway; the "smart users" talk, and write emails, to the not-so-savvy as well.

If users get nervous about your site... well, do I need to finish this sentence? I'd prefer the familiar ones :)

/claus
...did I say "a few" words... hmmm...

super_seo

4:13 am on Jun 30, 2003 (gmt 0)

10+ Year Member



I can't be the only one here who's had this problem :(

prophecy

12:06 am on Jul 1, 2003 (gmt 0)

10+ Year Member



super_seo: If you read through this thread, you'll find a lot of potential options for your situation, such as WebTrends Live, Ecommstats.com, etc.

super_seo

5:46 am on Jul 1, 2003 (gmt 0)

10+ Year Member



Thank you. I guess I was trying to push for some free info, because it has become pretty obvious I'm going to need to save my pennies for my stats software.

NetTracker quote: US$127,000++
WebTrends quote: US$165,000++

and many others I checked were quite pricey. I think I've decided to let my techies write me a primitive one to get me through the host migration.

Thanks

itrainu

12:03 pm on Jul 1, 2003 (gmt 0)

10+ Year Member



I use FastStats Analyzer. The price was a little more affordable than some of the others, and it came with a 30-day trial that I quite liked. I set filters, such as ignoring traffic from particular domains, and can save and copy the file for the following month.

If I was able to muscle up the money here in Canada, where our dollar is not worth a whole lot, then it had to be good *grin*

Happy Canada Day everyone!

itrainu

This 168-message thread spans 6 pages.