Forum Moderators: DixonJones
None of the log analysis programs I'm familiar with (WebTrends, Webalizer) can get this granular.
The way I have attempted to do it is to download my raw log files and search them in a text editor, which is fine for small log files, but there has to be a better way.
So how do people determine which pages google spidered and on what date?
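One option beyond a text editor is a short script that scans the raw log directly. A minimal sketch, assuming an Apache/NCSA combined-format access log and that Googlebot identifies itself in the user-agent string (the filename "access.log" is a placeholder):

```python
import re

# Match one line of an Apache/NCSA combined-format log, capturing the
# date portion of the timestamp, the requested path, and the user-agent.
LINE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[(?P<date>[^:]+):[^\]]+\] '
    r'"(?P<method>\S+) (?P<path>\S+)[^"]*" \d+ \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_hits(lines):
    """Return (date, path) pairs for requests whose user-agent
    contains 'Googlebot'."""
    hits = []
    for line in lines:
        m = LINE.match(line)
        if m and "Googlebot" in m.group("agent"):
            hits.append((m.group("date"), m.group("path")))
    return hits

if __name__ == "__main__":
    # "access.log" is an assumed filename; substitute your own log path.
    with open("access.log") as f:
        for date, path in googlebot_hits(f):
            print(date, path)
```

This gives a dated list of every page Googlebot requested, which can then be sorted or counted however you like. If your host writes logs in a different format, the regular expression would need adjusting.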
I've heard good things about a shareware program called 'Wingrep', which gives the same grep-style search functionality on Windows machines.
I've been looking for something that does this. What are you using to track the bot visits - SSI or something similar?