I download my raw logfile and analyse it with NetTracker Pro.
Does anyone know if it's possible to filter for a single spider (Googlebot) and have NetTracker display which pages it grabbed at what time?
Thanks for any replies.
finoo
12:33 pm on Apr 21, 2003 (gmt 0)
I use UltraEdit and sort the lines by IP. That way I can see how the bots crawl my sites in a readable way :)
File -> Sort -> Sort file
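For anyone who'd rather script this than sort in an editor, here's a minimal sketch in Python. It assumes the raw logfile is in the common Apache "combined" format; the regex, the sample lines, and the `googlebot_hits` helper are all illustrative, not anything NetTracker or UltraEdit provides. It pulls out the timestamp and path of every request whose user-agent contains "Googlebot":

```python
import re

# Matches the Apache "combined" log format (an assumption about the
# raw logfile's layout; adjust the pattern if your server differs).
LOG_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" \d+ \S+ '
    r'"[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_hits(lines):
    """Yield (timestamp, path) for every request made by Googlebot."""
    for line in lines:
        m = LOG_RE.match(line)
        if m and "Googlebot" in m.group("agent"):
            yield m.group("time"), m.group("path")

# Demo with two hypothetical log lines (one bot hit, one browser hit):
sample = [
    '66.249.66.1 - - [21/Apr/2003:12:33:00 +0000] "GET /index.html HTTP/1.0" '
    '200 512 "-" "Googlebot/2.1 (+http://www.googlebot.com/bot.html)"',
    '10.0.0.5 - - [21/Apr/2003:12:34:00 +0000] "GET /about.html HTTP/1.0" '
    '200 512 "-" "Mozilla/4.0"',
]
for when, path in googlebot_hits(sample):
    print(when, path)
```

In practice you'd replace `sample` with the lines of your actual logfile (e.g. `open("access.log")`), and you could sort the results by IP or timestamp afterwards, the same as the UltraEdit approach.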
matze
6:50 pm on Apr 21, 2003 (gmt 0)
Thanks finoo, but I already have NetTracker and would like to use it...
If there's no solution in NetTracker, I'll try UltraEdit ;)
lorax
7:20 pm on Apr 21, 2003 (gmt 0)
Have you tried going to: Marketing Analysis > Robot/Spider > # visits > # views?
<edited>geesh - for the 4th time - I think it's right now</edited>
matze
8:22 pm on Apr 21, 2003 (gmt 0)
lorax,
Thanks for the tip! I hadn't seen that you can click on the # views for each spider ;)
Now I can stop using my text editor to track Googlebot ;)
lorax
1:16 pm on Apr 22, 2003 (gmt 0)
NetTracker has a lot of neat stuff in it that I'm still discovering. You can even follow the path a particular visitor travelled through your site. Very handy little tool. Just wait till you get into custom reports! ;)