Welcome to WebmasterWorld
Forum Moderators: open
But if you can get the log file, download Webalizer (or the log file analyzer of your choice) from [mrunix.net...] It is command-line based, so it helps to create a batch file that lets you drag the log file onto it. The batch file will contain a command like the following:
"c:\program files\webalizer\webalizer.exe" -o "c:\program files\webalizer\stats" -R 1000 %1
This puts the results into a stats subdirectory and gives you the top 1000 referrers.
These instructions are for Windows of course.
Even better, though: what you need is access to your Raw Access Logs. Does your ISP provide these as a download? If you can get your Raw Access Logs, then you can purchase an analysis program which will allow you to, for example, view all visits by Googlebot to your website.
Anyway, hopefully you've got one of two things:
1. Access to a "Last 20 Visitors" feature
2. Access to your Raw Access Logs
freshBot: 64.68.82.* Bah, listed for only a few days; a short dinner date, then it dumps you.
deepcrawler: 216.239.46.* Good bot, it likes you; listed until death do you part. So don't make it mad or it will divorce your site. You can divorce the Googlebot yourself by using your robots.txt file. It's much cheaper and faster than going to court.
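For the record, the robots.txt "divorce" mentioned above would look like this. The file goes in your site root; this minimal sketch blocks Googlebot from the entire site:

```
User-agent: Googlebot
Disallow: /
```

Remove the rule and the bot should start courting your site again on its next pass.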
So, for Googlebot, I see an entry of
Googlebot/2.1 (+http://www.googlebot.com/bot.html) for the User-Agent and 220.127.116.11 for the IP address. At the moment the XSL transform just shows ALL visits, but I hope to modify the transform to allow me to search for Googlebot/FAST/Jeeves etc ....
Just an alternative to relying on a hosting provider for access to the log files, though it might become impractical as the number of visits to the page increases ...
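In the meantime, that kind of per-bot search can also be done with a plain grep over the raw log. A minimal sketch, assuming a combined-format log; the sample lines and file names here are invented for illustration, so point the grep at your own raw access log:

```shell
#!/bin/sh
# Sketch: filter a raw access log for several crawler user-agents at once.
# The sample log lines below are made up; use your own access log instead.
cat > access.log <<'EOF'
64.68.82.10 - - [01/Jan/2003:00:00:01] "GET / HTTP/1.0" 200 512 "-" "Googlebot/2.1 (+http://www.googlebot.com/bot.html)"
10.0.0.5 - - [01/Jan/2003:00:00:02] "GET / HTTP/1.0" 200 512 "-" "Mozilla/4.0"
10.0.0.9 - - [01/Jan/2003:00:00:03] "GET / HTTP/1.0" 200 512 "-" "FAST-WebCrawler/3.6"
EOF

# Keep only the lines whose user-agent mentions one of the bots.
grep -i -E 'Googlebot|FAST|Jeeves' access.log > bots.log

# Count hits per crawler from the filtered file.
for bot in Googlebot FAST Jeeves; do
  printf '%s: %s\n' "$bot" "$(grep -ic "$bot" bots.log)"
done
```

Extending the pattern to another crawler is just one more name in the alternation.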
I have a PR1 site (quite a new site; I think the PR will go up to 4 after one or two updates), and deepbot has visited my site as well.
On the log subject: I use analog for general-purpose log file analysis. It crunches 100 MB log files within seconds (on only a 1 GHz machine). If you have access to a Unix machine, I would use analog.
For searching for googlebot, I wrote a shell script that does this:
cat /var/log/apache/access.log | grep googlebot > /home/myaccount/google.log
grep keeps every line that contains the term googlebot. I then "analyse" the file with vi. When vi opens the file, I immediately see the number of lines, which is the number of hits. Quite low-tech but amazingly effective and easy to implement. You can grep the result again to filter out freshbot or deepbot as you wish.
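Following the same low-tech approach, the freshbot and deepbot visits can be split apart by the IP ranges mentioned earlier in the thread (64.68.82.* vs. 216.239.46.*). A sketch, with invented sample lines; substitute your own google.log:

```shell
#!/bin/sh
# Split Googlebot hits into freshbot (64.68.82.*) and deepbot
# (216.239.46.*) by IP prefix. Sample lines are made up for
# illustration; use the google.log produced by the grep above.
cat > google.log <<'EOF'
64.68.82.47 - - [01/Jan/2003:00:00:01] "GET /a HTTP/1.0" 200 100 "-" "Googlebot/2.1"
216.239.46.20 - - [01/Jan/2003:00:00:02] "GET /b HTTP/1.0" 200 100 "-" "Googlebot/2.1"
216.239.46.21 - - [01/Jan/2003:00:00:03] "GET /c HTTP/1.0" 200 100 "-" "Googlebot/2.1"
EOF

# grep -c prints the match count directly, so there is no need to
# open the file in vi just to read off the line count.
fresh=$(grep -c '^64\.68\.82\.' google.log)
deep=$(grep -c '^216\.239\.46\.' google.log)
echo "freshbot hits: $fresh"
echo "deepbot hits:  $deep"
```

Anchoring the pattern with `^` and escaping the dots keeps, say, 164.68.82.* or 64.68.820.* from being miscounted.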
So what is the E.T.A. on the next update? I know some of you guys have documented these lag times. I have a life so I do not have time for that kind of thing. :)
Let's not even talk about the next update. :-)
Deepbot came whilst our site was down for maintenance - will that be it for another month or will it re-appear in the next couple of days?
RankOutsider, if your site was previously in the index, then deepbot will usually look for it again during the crawl. Keep your fingers crossed... it might be back before this session is finished.