Forum Moderators: mack
When a surfer comes to your site, their browser sends a useragent (Internet Explorer, for example). When a bot lands on your site it also sends a useragent. Google's is Googlebot, and AltaVista's is Scooter. The useragent is basically a "calling card" saying "it's me."
Every time your page is accessed, the server records the visit in your log files. The log file is a record of every useragent that has been to your site. From your log files you will be able to see what came to your site, at what time, from where, and so on. There are software tools that help decipher the logs and turn them into useful information.
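If you want to see for yourself what's in there before reaching for a log analyzer, a few lines of Python will do. This is just a sketch: the sample lines below are made-up entries in the common Apache "combined" log format, where the useragent is the last quoted field on each line.

```python
import re
from collections import Counter

# Hypothetical sample lines in Apache "combined" log format.
# In practice you would read these from your real log file.
LOG_LINES = [
    '66.249.66.1 - - [10/Oct/2005:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 2326 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"',
    '1.2.3.4 - - [10/Oct/2005:13:56:01 +0000] "GET /about.html HTTP/1.1" 200 512 "-" "Mozilla/4.0 (compatible; MSIE 6.0)"',
    '66.249.66.1 - - [10/Oct/2005:13:57:12 +0000] "GET /contact.html HTTP/1.1" 200 734 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"',
]

# The useragent is the last "..." field on the line.
UA_RE = re.compile(r'"([^"]*)"\s*$')

counts = Counter()
for line in LOG_LINES:
    match = UA_RE.search(line)
    if match:
        counts[match.group(1)] += 1

# Print each useragent with its hit count, busiest first.
for agent, hits in counts.most_common():
    print(hits, agent)
```

Run against a real log, this gives you a quick "who has been visiting" list: browsers and bots alike, each identified by its calling card.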
Hope this is of some help.
Mack.
BROWEXCLUDE *
BROWINCLUDE Googlebot*
Now only the lines in your logfile related to Googlebot get analyzed.
Hope this helps.
replace the analog.cfg file's content with:
LOGFILE "C:\logdirectory\example.log"
OUTFILE report.html
BROWEXCLUDE *
BROWINCLUDE Googlebot*
REQUEST ON
REQFLOOR 1r
The last two lines will show you which pages the bot has visited.
Does anyone have the lines to see a graph of the number of visits per day during a month?
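I'm going from memory here, so check the Analog docs if the names are off, but Analog has daily-report directives. Adding these to analog.cfg should give you a day-by-day report with a bar for each day:

DAILYREP ON
DAILYSUM ON

DAILYREP gives one row (with a bar graph) per day of the log, which is what you want for a month's worth of data; DAILYSUM summarizes traffic by day of the week.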
Also, making use of the site search function helps a lot before posting any queries. But this shouldn't stop you from contributing.
The main feature of WebmasterWorld is that it welcomes everyone, with members trying to help each other out. There is no reason you shouldn't feel free to post. You are as welcome to contribute as the next person. It's all about community.
If you have a problem, do a search; if you still can't find an answer, find the forum where your question would be best suited and ask. I'm sure someone will be able to help you.
Mack.