Forum Moderators: open
QUESTIONS: Is this normal? Is 10-16% bot traffic a good thing or a bad thing - a necessary evil? Should I attempt to block bots that aren't worth allowing? Or should I tell the client to just accept that bot traffic is what happens when you are doing well in the search engines? I've tried to run WebTrends reports that filter out bot traffic, but many bots still slip through because they show up only as IP addresses. Is there a way to positively identify a visitor as a bot?
Much thanks in advance for any and all input.
Also, I would really recommend that you filter the traffic those bots account for out of your reports.
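If your reporting tool can't do that filtering on its own, you can pre-filter the raw access log before feeding it in. A minimal sketch in Python, assuming you match on user-agent substrings (the signature list and sample log lines below are just illustrative, not a complete list):

```python
# Crude pre-filter: drop likely-bot hits from an access log before reporting.
# The signatures and sample lines are examples only, not an exhaustive list.

BOT_SIGNATURES = ["googlebot", "bingbot", "slurp", "crawler", "spider"]

def is_bot(log_line: str) -> bool:
    """Return True if the line's user-agent string mentions a known bot."""
    return any(sig in log_line.lower() for sig in BOT_SIGNATURES)

log_lines = [
    '66.249.66.1 - - "GET / HTTP/1.1" 200 "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '203.0.113.9 - - "GET /page HTTP/1.1" 200 "Mozilla/5.0 (Windows NT 10.0)"',
]

# Keep only the lines that don't look like bot traffic.
human_lines = [line for line in log_lines if not is_bot(line)]
print(len(human_lines))  # only the second sample line survives
```

Note that this only catches bots that identify themselves honestly; bad bots that fake a browser user-agent will still slip through, which is why people also filter by known IP ranges.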
There are a lot of sites that list thousands and thousands of IPs/bots you can filter out; try doing a search on Google and look through previous threads in this forum. If you want some inspiration for a robots.txt file, check www.webmasterworld.com/robots.txt
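For a starting point, a minimal robots.txt that blocks one unwanted crawler while letting everything else through could look like this (the bot name here is just a placeholder, not a specific recommendation):

```txt
# Block one specific crawler entirely
User-agent: SomeBadBot
Disallow: /

# Allow all other bots full access
User-agent: *
Disallow:
```

Keep in mind that robots.txt is only a request - well-behaved bots honor it, but bad bots ignore it, so it's no substitute for server-side blocking.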
When your site becomes more popular (I'm not sure how popular the site you are talking about is), the savings grow: if you have more than 1,000 visitors per day and 10-16% of those are bots, you can save some money in bandwidth costs, and if you had a site like this one (yes, we all wish we had), I'd guess you could save a huge amount of money in bandwidth costs. You also prepare yourself for bad bots, which might cause big server problems such as overload.
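To put rough numbers on that, here is a back-of-the-envelope estimate of how much bandwidth the bot share eats per month. All of the figures except the 1,000 visitors/day and the 10-16% range from this thread are assumptions, made up purely for illustration:

```python
# Rough, illustrative estimate of monthly bandwidth consumed by bots.
visitors_per_day = 1000   # figure mentioned in the thread
bot_share = 0.15          # mid-range of the 10-16% bot traffic reported
pages_per_visit = 3       # assumed average pages per visit
avg_page_kb = 150         # assumed average page weight in KB

bot_kb_per_month = visitors_per_day * bot_share * pages_per_visit * avg_page_kb * 30
print(f"~{bot_kb_per_month / 1024 / 1024:.1f} GB/month of bot traffic")
```

Even with these modest assumptions that works out to roughly a couple of GB a month; scale the visitor count up and the cost of serving bots stops being theoretical.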