Unknown robot eating up bandwidth

seomasters msg:3981829 8:57 am on Sep 1, 2009 (gmt 0)

Hello,
I have been watching my AWStats data, and the following entries are eating up my bandwidth heavily. What are they? Can I stop them from consuming unnecessary bandwidth?
Unknown robot (identified by empty user agent string)
Unknown robot (identified by 'robot')
Unknown robot (identified by 'spider')
Thanks in advance for any help.
tangor msg:3983836 4:57 am on Sep 4, 2009 (gmt 0)
Do you have a robots.txt file? Whitelist the bots you want. Nuke (.htaccess) the rest. Cleans up your logs very nicely!

bigcat1967 msg:3985798 2:17 am on Sep 8, 2009 (gmt 0)
Tangor is right. In my logs I saw one of Amazon's bots hitting my site all the time, and I "disallowed" that particular bot in my robots.txt file.

theantagonizers msg:3988867 7:34 pm on Sep 13, 2009 (gmt 0)
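For reference, a whitelist-style robots.txt of the kind described above might look like the sketch below. The bot names are just examples; substitute the crawlers you actually want to allow, and note that only well-behaved bots obey robots.txt at all:

```
# robots.txt whitelist sketch: allow named crawlers, block everything else.
# Example bot names only -- use the user-agent tokens of the bots you trust.

User-agent: Googlebot
Disallow:

User-agent: Bingbot
Disallow:

# All other crawlers: keep out of the whole site.
User-agent: *
Disallow: /
```

An empty `Disallow:` line means "nothing is disallowed" for that user agent, which is how the whitelisted bots are let through.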
There is a possibility that the bots do not honor robots.txt. After you try robots.txt restrictions, if the problem persists, you can try to identify and block the offending IPs. Tangor's solution is a nice balance of the two, and whitelisting is the ideal way of controlling crawler access.
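For bots that ignore robots.txt, the .htaccess approach mentioned earlier can block them at the server. A minimal sketch for Apache, assuming mod_rewrite is available (the IP address below is a placeholder from the documentation range, not a real offender):

```apache
# Refuse requests with an empty User-Agent string
# (matches the "empty user agent" entries AWStats reports).
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^$
RewriteRule .* - [F,L]

# Deny a specific misbehaving IP (Apache 2.2 syntax; example address).
Order Allow,Deny
Allow from all
Deny from 192.0.2.10
```

The `[F]` flag returns a 403 Forbidden, so the bot gets a short error response instead of your pages, which is what actually saves the bandwidth.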