Forum Moderators: mack
Thanks very much for any help
I usually do traffic reports once a month.
Sometimes spot checks are good for measuring the results of conventional media placement. Some clients want to see the results of radio, TV or print ads before investing more in them.
>>Or do log files keep a running track of what's happening for a certain amount of time anyway?
This depends on your host. Some delete logfiles after a given time or size. The best service I get from remote hosts is that they zip the logfiles monthly for me to download, and keep them available for a month afterwards.
I can merge them later for yearly stats.
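Merging the monthly archives back into one file for a yearly run is a one-step job. A sketch, with invented filenames standing in for whatever your host actually produces (it builds two tiny sample archives so it can run on its own):

```shell
# Sketch only: the filenames below are stand-ins, not real host output.
set -e
work=$(mktemp -d)

# Fake monthly archives, in place of the host's real zipped logs.
printf 'jan-1\njan-2\n' | gzip > "$work/access-2003-01.log.gz"
printf 'feb-1\n'        | gzip > "$work/access-2003-02.log.gz"

# YYYY-MM names sort lexically, so the glob merges them in date order.
# gzip -dc is used rather than zcat for portability across systems.
gzip -dc "$work"/access-2003-*.log.gz > "$work/access-year.log"

wc -l < "$work/access-year.log"    # 3 lines: 2 from Jan, 1 from Feb
```

The merged file can then be handed to Analog (or any other analyzer) in a single pass for the yearly stats.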
I usually ask Analog to output COMPUTER format for later use.
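For reference, that machine-readable output is selected with Analog's OUTPUT command in the config file. A minimal sketch (the paths here are made-up examples, not a real setup):

```
# analog.cfg -- minimal sketch; paths are illustrative only
LOGFILE /logs/access-2003-*.log
OUTPUT COMPUTER          # machine-readable format, easy to re-crunch later
OUTFILE stats/2003.dat
```

The COMPUTER format is handy precisely because other tools (or a later Analog run) can parse it back in.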
On one of my clients' sites I do a quick log check every day (the site gets about 100,000 unique visitors per day). We are running WebTrends Live via the Reporting service and I find it invaluable for this.
I do, however, go through the logs once a month in much finer detail, often passing them through Analog for a second opinion!
HTH
Typical logfile sizes vary a lot from site to site. A lot of them are over a gig per month and take hours to download and 'crunch'. BlobFisk cares for a site with 100,000 uniques a day. I believe he is not storing the files on a floppy. ;)
My host deletes log files every Sunday evening. So to get a month of stats, I would need to download the logs once a week, before they're removed, and then run them together. Does that sound right?
Getting a .zip of the entire month sounds streamlined, that would be nice.
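If the host really does purge on Sunday evenings, a scheduled fetch earlier in the weekend can do the weekly download for you. A hypothetical crontab entry (hostname, user and paths are placeholders, and note that % must be escaped as \% in crontab):

```
# Fetch the week's log every Saturday at 23:30, before the Sunday purge.
30 23 * * 6  scp user@host.example.com:/logs/access.log /home/me/logs/access-$(date +\%Y\%m\%d).log
```

That leaves a dated copy per week, which can then be concatenated for the monthly run.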
>>I believe he is not storing the files on a floppy.
It would make things a bit easier if I could! :)
In fairness, I only have 2 clients where logs require this level of effort. All the others would typically be once a month (unless there is a campaign that should be generating increased traffic), although I would take a quick peek once a week to see how things are going.
I really enjoy analyzing log files (most of the time)... it's interesting to see the variations and draw meaning from the statistics.
One of my sites is about fifty pages, gets about 100 visitors a day, varying; about 2MB of HTTP traffic per day, including about 300kb per day from a banner exchange. Anyway, a small niche SE's spider had beaucoup problems with my forum (phpBB) one fine day - it didn't like the SIDs. It hit my site (actually, an error page - phpBB exhausted its sessions table and went splat) just over 1,000 times per hour for about six hours straight, before I checked the logs and noticed. I called the fellow running the SE, and he was completely unaware of the problem. Lessons learned: Ban all robots from /forum, and check the logs frequently for signs of problems...
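For anyone wanting to do the same, banning well-behaved robots from the forum takes two lines in robots.txt at the site root (the /forum path matches the setup above):

```
# robots.txt -- keep spiders out of the session-ID'd forum pages
User-agent: *
Disallow: /forum
```

Keep in mind this is only honored by crawlers that respect the robots exclusion standard; a misbehaving spider like the one above would still need to be blocked server-side.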