Do you know of a concise list of the most important bot user agents that I can easily copy into a PHP array?
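To make the question concrete, here is a minimal sketch of what I have in mind. The token list is just the handful of crawlers I know of, not a complete list, and is_known_bot is my own placeholder helper name:

<?php
// Illustrative tokens for well-known crawlers; extend from your own logs.
$botTokens = array(
    'googlebot',           // Google
    'bingbot',             // Bing
    'slurp',               // Yahoo
    'duckduckbot',         // DuckDuckGo
    'baiduspider',         // Baidu
    'yandexbot',           // Yandex
    'ahrefsbot',           // Ahrefs
    'semrushbot',          // Semrush
    'mj12bot',             // Majestic
    'facebookexternalhit', // Facebook link preview
);

// Case-insensitive substring match against the request's user agent.
function is_known_bot($userAgent, array $botTokens) {
    foreach ($botTokens as $token) {
        if (stripos($userAgent, $token) !== false) {
            return true;
        }
    }
    return false;
}

// Usage: skip known bots before counting a visit.
$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
if (!is_known_bot($ua, $botTokens)) {
    // count this request as a (probably) human visit
}
?>

A substring match is deliberate here, since many crawlers embed version numbers and URLs in their user agent strings, so exact matching would miss them.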
The reason: I found that only 3% of our visitors overall seem to put anything into their shopping cart. I find this ratio extremely low, because I think our usability is relatively comfortable overall, and some of our best landing pages convert at more than 6% orders per visit (20% add-to-cart). So I believe this low overall ratio of 3% is due to the fact that quite a lot of spiders frequently hit the main page.
It is generally said that an orders-per-visit ratio of 1-3% is normal. Do you know of other benchmarks after filtering out spiders? Is there a general difference between websites according to spider activity? (I could imagine that older websites, "good" ones, or frequently changing ones are spidered more often.)
But I doubt I am the first one faced with the problem of getting more accurate data in this respect, so I thought someone might help me avoid reinventing the wheel.
I'm a bit stubborn: my own analysis is getting more and more refined, and I have come to the conclusion that the number of human beings in all traffic should be reduced to (not by!) 10-20% of what my host's stats report. One thing that is easy to detect is official spiders' user agents. Another is the anonymous scraper spiders hiding behind some ordinary Mozilla string, but coming back again and again at intervals longer than half an hour, which means every visit gets registered as a distinct visitor.
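For that second case, this is roughly the heuristic I'm using. It's only a sketch: the log format is assumed to be preparsed into IP/timestamp pairs, and the thresholds (30-minute gap, 5 returns) are my own guesses, not established benchmarks:

<?php
// Flag IPs that keep returning after session-timeout-length gaps,
// i.e. each return is counted by the stats as a new distinct visitor.
// Each $entries item is array('ip' => string, 'time' => unix timestamp).
function find_suspected_scrapers(array $entries, $minGap = 1800, $minReturns = 5) {
    // group hit timestamps by IP
    $hitsByIp = array();
    foreach ($entries as $e) {
        $hitsByIp[$e['ip']][] = $e['time'];
    }

    $suspects = array();
    foreach ($hitsByIp as $ip => $times) {
        sort($times);
        $returns = 0;
        for ($i = 1; $i < count($times); $i++) {
            // a gap longer than the session timeout means a new "visit"
            if ($times[$i] - $times[$i - 1] > $minGap) {
                $returns++;
            }
        }
        if ($returns >= $minReturns) {
            $suspects[] = $ip;
        }
    }
    return $suspects;
}
?>

An IP that registers as five or more "distinct visitors" in one day is, in my experience, far more likely a scraper on a timer than a human coming back that often.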
Traffic definitely isn't what it seems to be, but obviously no one wants to hear that...