It depends on what you need to know most about your website.
If you feel like you need more traffic, then look at the referring URLs to see where users are coming from now. If you think you have conversion problems, then follow the paths of individual sessions. If you are thinking about writing new content, look at traffic by page to see what's popular with your users right now.
If you are not sure what you need to know about your site, then take a look at every standard report they offer, learn what's going on, and think about those stats in the context of what you are trying to accomplish with your website. Once you do that, you should have an idea of the specifics to dig into.
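The traffic-by-page idea is easy to approximate yourself straight from the raw log. A minimal sketch, assuming Apache "combined" log format (the sample lines and paths below are made up for illustration):

```python
import re
from collections import Counter

# Tally requests per page from Apache "combined" format log lines.
REQUEST_RE = re.compile(r'"(?:GET|POST|HEAD) (\S+) HTTP/[\d.]+"')

def page_hits(lines):
    """Count how many times each path was requested."""
    hits = Counter()
    for line in lines:
        match = REQUEST_RE.search(line)
        if match:
            hits[match.group(1)] += 1
    return hits

# Two made-up sample lines; in practice, iterate over your log file instead.
sample = [
    '1.2.3.4 - - [01/Jan/2006:10:00:00 +0000] "GET /index.html HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
    '1.2.3.4 - - [01/Jan/2006:10:00:05 +0000] "GET /index.html HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
for path, count in page_hits(sample).most_common():
    print(count, path)  # most popular pages first
```

A log analyzer does the same counting for you, with date ranges and filters on top.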
I use WebLog Expert and love it. One very important thing to do is make sure you set up the time option so that it matches your server's time. Set your items to truncate tables at 10,000 and show the last 10,000 days per folder. Here are several reports to run:
1. Exclude * on spider
2. Include Googlebot on spider
3. Filter out your IP
4. If you don't do business outside the US, filter by country. (Don't forget to go to their site and download the city and country databases.)
5. If you have important files, go to the track files section and see detailed reports for each of those pages.
One place that I go to often is the pages section, especially when I run the "include Googlebot" filter; that way I know what Gbot has spidered that day, or over whatever time range I have set.
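For anyone curious what an "include Googlebot" filter boils down to: keep only the log lines whose user-agent string mentions Googlebot, then look at the paths it requested. A rough sketch (sample lines are made up; the Googlebot IP and UA string mimic real crawler entries):

```python
import re

# Pull the requested path out of an Apache "combined" format log line.
REQUEST_RE = re.compile(r'"(?:GET|HEAD) (\S+) HTTP/[\d.]+"')

def googlebot_pages(lines):
    """Return the paths requested by clients identifying as Googlebot."""
    pages = []
    for line in lines:
        if "Googlebot" in line:  # crude user-agent match, like the analyzer's filter
            m = REQUEST_RE.search(line)
            if m:
                pages.append(m.group(1))
    return pages

sample = [
    '66.249.66.1 - - [01/Jan/2006:03:12:00 +0000] "GET /products.html HTTP/1.1" 200 1024 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"',
    '1.2.3.4 - - [01/Jan/2006:03:13:00 +0000] "GET /index.html HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
print(googlebot_pages(sample))
```

Note that anything can claim to be Googlebot in its user-agent, so this only tells you what *claims* to be Gbot.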
ogletree I wanted to say thanks for recommending this software. I spent some time this afternoon running this analyzer through its paces, and it's quite impressive. It's fast, not very resource intensive, and best of all for me it seems to work on my Chinese and Japanese logs. The generated graphs for keywords are gibberish, but the tables are fine for the most part. (I've seen a few glitches, but the data is very usable.)
I'm going to let this run on my logs overnight and see what shows up in the morning. I like what I've seen so far.
I run a number of different sites, so having something to run locally was a must. My only complaints now are that I can't set different server times for different logs, and I don't see where I can edit the Search Engine files to add other SEs.
Any other tips?
Turn off the DNS lookup if you want it to run faster; the DNS lookup slows it down drastically. You can filter out other spiders by doing a filter by host, and you can also put an IP in the host field. Maybe you could install the program in several folders and have each copy set to a different time zone. It is not the best, but it is great for its price.
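The "filter by host" trick works because crawler IPs reverse-resolve to recognizable hostnames (genuine Googlebot addresses resolve to names like crawl-66-249-66-1.googlebot.com). The reverse lookup itself is `socket.gethostbyaddr`, which is exactly the slow DNS step mentioned above; once you have the hostname, the check is just a suffix match. A minimal sketch of that check:

```python
# Given a reverse-resolved hostname (e.g. from socket.gethostbyaddr(ip)),
# check whether it belongs to Google's crawler domains. Google documents
# that real Googlebot hostnames end in googlebot.com or google.com.
def looks_like_googlebot(hostname):
    """True if a reverse-resolved hostname matches Google's crawler domains."""
    return hostname.endswith(".googlebot.com") or hostname.endswith(".google.com")

print(looks_like_googlebot("crawl-66-249-66-1.googlebot.com"))  # genuine-looking
print(looks_like_googlebot("fake-bot.example.com"))             # impostor
```

Doing the lookup yourself per-IP is slow for the same reason the analyzer's DNS option is slow, so it's best reserved for spot-checking suspicious entries.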
I use WebLog Expert too, but if that is not the best... Then:
What is the best?
>One place that I go to often is the pages section, especially when I run the "include Googlebot" filter; that way I know what Gbot has spidered that day, or over whatever time range I have set.
Ogletree... or anyone: How do I know which pages Gbot has specifically looked at, or how deep it's crawled? I can't figure this out.
Bill -> I don't see where I can edit the Search Engine files to add other SEs.
You have to manually open the .cfg files in the Config folder of the installation. Any additions are only recognized upon loading the program, so if you edit config files whilst looking at a report, you will have to exit and reload the program for the changes to take effect.
I guess the next question is going to be, "do you know what the parameters are?" They list engines like this:
Google = google; [b]q[/b]
Yahoo = yahoo; [b]p[/b]
About.com = about.com; [b]terms[/b]
AllTheWeb = alltheweb; [b]q[/b]
Altavista = altavista; [b]q[/b]
AOL Search = aol.com; [b]query[/b]
AskJeeves = askjeeves; [b]ask[/b]
AskJeeves = ask.com; [b]q[/b]
DirectHit = directhit; [b]qry[/b]
There are a whole bunch of different ones used. I'm wondering if these are like the parameters used in Analog config files...at least those are documented ;)
I may try e-mailing their support people. They've already helped me figure out some other issues.
I understand it now (with a little help from the WebLog Expert people).
The third element is a parameter name that contains the keywords. For example, a Google referrer URL with search information might look like this: http://www.google.com/search?q=widget+expert
You can see that the name of the parameter that contains the keywords is q (q=widget+expert), so this name is included in the SearchEngines.cfg file.
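That host-substring + parameter-name pair is all the analyzer needs to pull keywords out of a referrer. A sketch of the same logic, using a few entries copied from the list above (this is my own illustration, not WebLog Expert's actual matching code):

```python
from urllib.parse import urlparse, parse_qs

# Engine substring -> query parameter that holds the keywords,
# mirroring a few SearchEngines.cfg entries quoted earlier.
ENGINES = {
    "google": "q",
    "yahoo": "p",
    "about.com": "terms",
    "aol.com": "query",
}

def extract_keywords(referrer):
    """Return the search phrase from a referrer URL, or None if no engine matches."""
    parsed = urlparse(referrer)
    params = parse_qs(parsed.query)
    for engine, param in ENGINES.items():
        if engine in parsed.netloc and param in params:
            return params[param][0]
    return None

print(extract_keywords("http://www.google.com/search?q=widget+expert"))
```

`parse_qs` decodes the `+` signs into spaces, so the example yields the phrase "widget expert".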
Run a report with an "include Googlebot" filter only, then look at the pages section and you will see.