Forum Moderators: DixonJones
I've been using this on multiple sites as well. I just create separate reports, since each site has different needs and different information that I want from it.
I hear WebTrends is very nice, but it's out of the low budget range that I'm in.
Just an aside, but I thought it was worth throwing out: I now put all log files and reports on a CD-RW for safekeeping. I lost around a year's worth of log files and never felt so helpless after the fact (the tape backup was bad).
Brian
This sounds like a good solution. Quick question, though: with your sites, do you have to run a redirect on the root to point to the correct domain and folder? And if you do, do you know whether it affects spiders like Google?
I am trying to make sense of all this. Right now I have a separate folder for each site and a separate homepage file in the main root, with all links pointing to the correct folder. But to run these reports effectively, I would have to put the homepage in the correct site folder to get accurate reports.
If you have only one log file for all the sites, then it is problematic. On one Win2K host we run 5 different websites, using Server.Transfer on the index page to direct visitors to the different folders.
Using the filters in FastStats we can analyse each site's internal pages; unfortunately it cannot tell the difference between the index pages, as it considers them all the same.
One solution is to import the log file into Access (or any other database program), run queries that separate out each of the different domains, and then save each query result in a delimited text file that FastStats can interpret.
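If you'd rather skip the Access step, the same split can be scripted. Here's a minimal sketch in Python, assuming your IIS logs are in W3C extended format and were configured to record the cs-host field (the file names and field layout below are just for illustration); it writes one log file per domain, each of which FastStats can then read on its own:

```python
# Split a combined W3C extended log into one file per domain,
# so each site can be analysed separately.
# Assumes the log records the cs-host field; the field order is
# read from the "#Fields:" header line rather than hard-coded.

def split_log_by_host(log_path):
    host_index = None
    header_lines = []   # "#" comment lines, copied into each output file
    outputs = {}        # hostname -> open file handle
    try:
        with open(log_path) as log:
            for line in log:
                if line.startswith("#"):
                    header_lines.append(line)
                    if line.startswith("#Fields:"):
                        fields = line.split()[1:]
                        host_index = fields.index("cs-host")
                    continue
                if host_index is None:
                    continue  # no "#Fields:" header seen yet
                parts = line.split()
                if len(parts) <= host_index:
                    continue  # malformed line, skip it
                host = parts[host_index].lower()
                if host not in outputs:
                    out = open(log_path + "." + host, "w")
                    out.writelines(header_lines)
                    outputs[host] = out
                outputs[host].write(line)
    finally:
        for out in outputs.values():
            out.close()
```

Run it once per day's log and point FastStats at the per-domain files; since each file keeps the original header lines, the analyser should parse them the same way it parses the combined log.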
However, I am curious whether this could affect my placement with the spiders.