If you feel like you need more traffic, look at the referring URLs to see where your users are coming from. If you think you have conversion problems, follow the paths of individual sessions. If you're thinking about writing new content, look at traffic by page to see what's currently popular with your users.
If you're not sure what you need to know about your site, take a look at every standard report they offer, learn what's going on, and think about those stats in the context of what you're trying to accomplish with your website. Once you do that, you should have an idea of which specifics to dig into.
Here are several reports to run:
1. Exclude * on spider
2. Include Googlebot on spider
3. Filter out your IP
4. If you don't do business outside the US, filter by country. (Don't forget to go to their site and download the cities and countries databases.)
5. If you have important files, go to the track files section and see the detailed reports for those pages.
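If you want to sanity-check a couple of these filters outside the tool, a quick script can approximate reports 2 and 3 against a raw access log. This is just a sketch, not the tool's own logic; the log path and the `MY_IP` address below are placeholders you'd swap for your own values:

```python
MY_IP = "203.0.113.7"  # placeholder: replace with your own IP address

def googlebot_lines(log_path):
    """Yield access-log lines from Googlebot, skipping your own IP.

    Assumes a common log format where the client IP is the first
    whitespace-delimited field on each line.
    """
    with open(log_path) as log:
        for line in log:
            ip = line.split(" ", 1)[0]
            if ip == MY_IP:
                continue  # report 3: filter out your own traffic
            if "Googlebot" in line:  # report 2: include Googlebot only
                yield line

# Example: count the requests Googlebot made
# hits = sum(1 for _ in googlebot_lines("access.log"))
```

Running the equivalent filters in the program itself is still easier for day-to-day use; this is only handy for spot-checking that the report numbers look right.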
One place I go to often is the pages section, especially when I run the "include Googlebot" filter; that way I know what Gbot has spidered that day, or whatever time range I have set.
I'm going to let this run on my logs overnight and see what shows up in the morning. I like what I've seen so far.
I run a number of different sites, so having something to run locally was a must. My only complaints now are that I can't set different server times for different logs, and I don't see where I can edit the Search Engine files to add other SEs.
Any other tips?
Ogletree... or anyone: how do I know what pages Gbot has specifically looked at, or how deep it's crawled? I can't figure this out.
You have to manually open the .cfg files in the Config folder of the installation. Any additions are only recognized when the program loads, so if you edit the config files whilst looking at a report, you will have to exit and reload the program for the changes to take effect.
I guess the next question is going to be, "do you know what the parameters are?" They list engines like this:
Google = google; q
Yahoo = yahoo; p
About.com = about.com; terms
AllTheWeb = alltheweb; q
Altavista = altavista; q
AOL Search = aol.com; query
AskJeeves = askjeeves; ask
AskJeeves = ask.com; q
DirectHit = directhit; qry
There are a whole bunch of different ones used. I'm wondering if these are like the parameters used in Analog config files...at least those are documented ;)
I may try e-mailing their support people. They've already helped me figure out some other issues.
The third element is the name of the parameter that contains the keywords. For example, a Google referrer URL with search information looks something like www.google.com/search?q=widget+expert. You can see that the parameter containing the keywords is q (q=widget+expert), so that name is what goes in the SearchEngines.cfg file.
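To make the mechanics concrete, here's a small sketch (plain Python, not the tool's own code) that uses the same substring-and-parameter pairs from the config listing above to pull the search phrase out of a referrer URL:

```python
from urllib.parse import urlparse, parse_qs

# Same "substring; keyword parameter" pairs as in SearchEngines.cfg
ENGINES = {
    "google": "q",
    "yahoo": "p",
    "about.com": "terms",
    "alltheweb": "q",
    "altavista": "q",
    "aol.com": "query",
    "askjeeves": "ask",
    "ask.com": "q",
    "directhit": "qry",
}

def keywords_from_referrer(url):
    """Return the search phrase from a referrer URL, or None if no engine matches."""
    parsed = urlparse(url)
    params = parse_qs(parsed.query)
    for substring, param in ENGINES.items():
        # Match the engine by hostname substring, then read its keyword parameter
        if substring in parsed.netloc and param in params:
            return params[param][0]
    return None

# keywords_from_referrer("http://www.google.com/search?q=widget+expert")
# -> "widget expert"
```

That's presumably all the tool is doing with these entries: match the referrer's hostname against the substring, then read the named query-string parameter as the keywords.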