The only way to track uniques with any real accuracy is with script/cookie-based tracking services like Hitbox, Hitslink or Webstat - and even those are prone to inaccuracies.
If your numbers are much higher using WebTrends, my guess is that WebTrends is at fault rather than Google - server log analysers generally overstate visitor counts, and if you're judging traffic in order to verify payments, you should seriously consider the alternatives.
Also, to analyse logs without downloading software, you can use scripts like AWStats (which is free) - a Perl script that parses your server logs and creates HTML reports on your site.
I believe Webalizer is also free.
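If you go the AWStats route, the usual way to refresh the stats from the command line is something like this (the config name here is just a placeholder for whatever you set up during install):

perl awstats.pl -config=example.com -update

That parses any new lines in the log and updates the report data; you then browse the HTML reports it generates.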
You can also FTP the raw log to your machine, open it in TextPad and do some fairly quick sorting from that.
It helps if all visitors from the PPC campaign arrive on a dedicated landing page.
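For example, if the PPC traffic lands on its own page, a rough daily count is a one-liner (the path and log name below are made up for illustration):

grep -c "GET /ppc-landing" access.log

grep -c prints the number of matching lines instead of the lines themselves.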
Download your raw log into your text editor or spreadsheet.
Delete all lines that are hits for images, scripts, robots etc. - e.g. delete all lines containing .gif, .jpg, .js, .ico and so on; delete lines that are HEAD requests only and error responses like 404 and 403; delete bots like Googlebot and Slurp.
Unless you are getting tens of thousands of uniques a day, a 24-hour analysis is then very doable. If you have lots of images or scripts, just deleting those lines may reduce your log file by 90 to 95% - see the shell sketch below for one way to automate the deletions.
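If you have shell access, the same clean-up can be done on the server before you download anything. A rough sketch, assuming a standard combined-format log called access.log (adjust the patterns and file names to taste):

# drop images/scripts, then error responses, then known bots
grep -vE "\.(gif|jpg|js|ico|css|png)" access.log \
  | grep -vE " (404|403) " \
  | grep -viE "googlebot|slurp" \
  > cleaned.log

The status code sits between spaces in the standard log format, which is why the middle grep includes the surrounding spaces.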
Human review of every single log entry would obviously not be very time-effective. I'll read that question as "without buying" any software.
Personally, I wrote my own set of log analysis tools (in C). While I fully realize this is not for everyone, it is the best solution IMHO. It's "free" other than the time spent writing and updating the code.
While on this topic, I have had clients ask about such software (we're not willing to give up our own). Mostly the type that can report on "actions" - as in "visitors who viewed both page X (pricing) and page Y (order page)", plus where those visitors came from and on what keyword phrase.
Does any commercial software allow for this? For those familiar with commercial log analyzers, this is probably a dumb question. Let me apologize now :)
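For what it's worth, that kind of "actions" query can be approximated from a raw combined-format log with standard Unix tools. A rough sketch - the page paths are invented, and it uses the IP address as a crude stand-in for a visitor, which proxies and shared connections will blur:

# unique IPs that requested the pricing page
grep '"GET /pricing' access.log | awk '{print $1}' | sort -u > saw_pricing.txt
# unique IPs that requested the order page
grep '"GET /order' access.log | awk '{print $1}' | sort -u > saw_order.txt
# IPs present in both lists (comm needs sorted input, which sort -u gives)
comm -12 saw_pricing.txt saw_order.txt > saw_both.txt
# pull those visitors' full log lines back out to inspect referrers
grep -f saw_both.txt access.log | less

grep -f treats each line of the file as a pattern, so the last step shows every request from the overlapping IPs, including the referrer field where the search phrase usually lives.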
We use NoteTab with some custom scripting, but any powerful text editor can be time-effective for this task.
In any spreadsheet like Excel, or a database program like dBase or Access for another example, it's just a matter of filtering and sorting and developing a few custom macros. For us, both methods are actually more time-effective for certain queries than using most weblog software. But we are only analysing 10,000 visitors a day or so.
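The shell equivalent of that filter-and-count style is the classic sort | uniq -c idiom - a sketch, with the log path assumed:

# top 20 requesting IPs by hit count
awk '{print $1}' access.log | sort | uniq -c | sort -rn | head -20

Swap $1 for $7 (the request path, in the default combined format) to rank pages instead of visitors.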
Do this to strip out GIFs and JPEGs while viewing only lines that have google in them (Google referrers, most likely):
cat /var/log/httpd/access | grep -v ".gif" | grep -v ".jpg" | grep google | less

That will display it on the screen one page at a time; use the space bar to page through. To send the output to a file (that you can download):
cat /var/log/httpd/access | grep -v ".gif" | grep -v ".jpg" | grep google > somefilename.txt

In the commands above, replace /var/log/httpd/access with the actual path to your log file on the server.
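Incidentally, the two exclusion greps can be folded into one with an extended regex, and -i makes the google match case-insensitive - same idea, slightly tighter:

grep -vE "\.(gif|jpg)" /var/log/httpd/access | grep -i google > somefilename.txt

grep can read the file directly, so the cat isn't strictly needed either.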