Forum Moderators: DixonJones


Checking Weblogs

Is there a way to do it without software?

kris

9:15 pm on Jan 27, 2003 (gmt 0)

10+ Year Member



Hello,
We currently use Webtrends, but we need a second opinion: we are in a dispute with AdWords over the amount of traffic we received from them.

Is there a way to check our Web Logs without any software? Thanks for any help!

Kris

Mardi_Gras

9:21 pm on Jan 27, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Without any software? No. But you could use a commonly available program like Excel (you may run into its 65,536-row limit, depending on log size) or Access to analyze them. You could also download free trials of one or more of the leading log analysis packages and use them for the trial period to compare their results against what you are getting now.
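
If the log is too big for Excel to open in one piece, you can split it into chunks under the row limit first. A quick sketch using the standard Unix split tool - the chunk size and output prefix here are just examples:

# break the raw log into 60,000-line pieces: access_part_aa, access_part_ab, ...
split -l 60000 access.log access_part_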

Receptional Andy

9:39 am on Jan 30, 2003 (gmt 0)



Webtrends judges unique visitors based on IP addresses unless you have successfully implemented their cookie plugin (and even then I wouldn't rely on it...)

The only way to track uniques with any degree of accuracy is by using script/cookie based tracking techniques like Hitbox, Hitslink or Webstat - and even they are prone to inaccuracies.

If your numbers are much higher using Webtrends, my guess is that Webtrends is at fault rather than Google - server log analysers generally overstate visitor counts, and if you're judging traffic in order to verify payments you should seriously consider the alternatives.

Also, to analyse logs without downloading software you can use scripts like AWStats - a free Perl script that parses your server logs and creates HTML reports for your site.
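
Once AWStats is set up on the server, refreshing the stats and producing a report is just a couple of commands - roughly like this, where the config name is a placeholder for your own:

# read any new log lines into the AWStats data files
perl awstats.pl -config=mysite -update
# write a static HTML report
perl awstats.pl -config=mysite -output > report.html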

Mark_A

9:54 am on Jan 30, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



>Also, to analyse logs without downloading software you can use scripts like AWStats - a free Perl script that parses your server logs and creates HTML reports for your site. <

I believe Webalizer is also free.
You can also FTP the raw log to your machine, open it in TextPad and do some quick sorting from there.

It helps if all visitors from the PPC campaign arrive on a dedicated landing page.
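
If they do all land on one page, counting them is a one-liner on the server - the path here is made up, substitute your own landing page:

# count requests for the PPC landing page
grep "/adwords-landing.html" access.log | wc -l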

jlr1001

5:05 am on Jan 31, 2003 (gmt 0)

10+ Year Member



>Is there a way to check our Web Logs without any software?

Well actually . . . you could do everything by hand if you really wanted. A log file is nothing more than an ASCII text document.
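
For reference, a typical line in a server log (Apache "combined" format - the values here are invented) looks like this:

127.0.0.1 - - [27/Jan/2003:21:15:00 +0000] "GET /index.html HTTP/1.0" 200 1234 "http://www.google.com/search?q=example" "Mozilla/4.0 (compatible; MSIE 6.0)"

One line per request: client IP, timestamp, the request itself, status code, bytes sent, referer and user-agent.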

I wouldn't recommend doing this, though.

-jlr1001

chiyo

5:16 am on Jan 31, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yes - the raw data is nothing but a text file.

Download your raw log into your text editor or spreadsheet.

Delete all lines that are hits for images, scripts, robots etc. - e.g. delete all lines containing .gif, .jpg, .js, .ico and so on. Delete lines that are HEAD requests only, and lines with error codes like 404 and 403. Delete bots like Googlebot, Slurp etc.

Unless you are getting tens of thousands of uniques a day, a 24-hour analysis is then very doable. If you have lots of images or scripts, just deleting those lines may reduce your log file by 90 to 95%.
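
On a Unix box you can do the same clean-up in one pass with grep before you ever open an editor. A rough sketch - the extensions and bot names are just the common ones, and the status-code match assumes the standard log format:

# drop images, scripts, icons, error responses and the big crawlers
grep -v -E "\.(gif|jpg|js|ico)" access.log | grep -v -E " (403|404) " | grep -v -i -E "googlebot|slurp" > filtered.log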

marcs

5:26 am on Jan 31, 2003 (gmt 0)

10+ Year Member



>Is there a way to check our Web Logs without any software?

Human review of every single log entry would obviously not be very time-effective. I'll read that question as "without buying" any software.

Personally, I wrote my own set of log analysis tools (in C). While I fully realize this is not for everyone, it is the best solution IMHO. It's "free" other than the time spent writing and updating the code.

While on this topic, I have had clients ask about such software (we're not willing to give up our own) - mostly the type that can report on "actions", as in "visitors who viewed both page X (pricing) and page Y (order page)", plus where those visitors came from and on what keyword phrase.

Does any commercial software allow for this? For those familiar with commercial log analyzers this will probably be a dumb question. Let me apologize now :)

chiyo

7:49 am on Jan 31, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>Human review of every single log entry would obviously not be very time-effective<<

We use NoteTab with some custom scripting, but any powerful text editor can be time-effective for this task.

In any spreadsheet like Excel, or a database program like dBase or Access, it's just a matter of filtering, sorting and developing a few custom macros. For us, both methods are actually more time-effective for certain queries than most weblog software - but then we are only analysing 10,000 visitors a day or so.
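
For marcs' two-pages question above, the same filtering idea works from the shell as well. A rough bash sketch with made-up page paths - it treats each IP address as one visitor, which is only an approximation:

# IP addresses that requested both the pricing page and the order page
comm -12 <(grep "/pricing.html" access.log | awk '{print $1}' | sort -u) <(grep "/order.html" access.log | awk '{print $1}' | sort -u)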

kris

8:26 pm on Feb 3, 2003 (gmt 0)

10+ Year Member



Great info, thanks for all the help!

jamesa

8:59 pm on Feb 3, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If your site is on a UNIX system and you don't mind getting your hands dirty in the shell, you can use grep to pre-filter your logs, making them easier to read through - and smaller, if you will import them into a spreadsheet or database.

Do this to strip out GIFs and JPEGs while viewing only lines that have google in them (Google referers, most likely):

cat /var/log/httpd/access | grep -v "\.gif" | grep -v "\.jpg" | grep google | less

That will display it on the screen one page at a time; use the space bar to page through. To send the output to a file (that you can download):

cat /var/log/httpd/access | grep -v "\.gif" | grep -v "\.jpg" | grep google > somefilename.txt

In the commands above replace /var/log/httpd/access with the actual path to your log file on the server.
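
And since the original question was about traffic counts: assuming the common log format, where the client IP is the first field on each line, a rough unique-visitor figure for the Google-referred traffic is one more pipeline (rough, because proxies and shared IPs blur it):

# count distinct IP addresses among google-referred requests
grep google /var/log/httpd/access | awk '{print $1}' | sort -u | wc -l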