I don't know of any parameter other than the user agent that can be considered related to a human visit, and it is impossible to keep an up-to-date listing of browser UAs or robot UAs (search-engine bots, site downloaders, email harvesters, etc.).
Is there any trick to recognize human visits? Perhaps calculating the time spent on a page?
Any input will help
Looking for a fast series of page requests could be a good one.
Perhaps checking if an IP requested more than 30 or 40 pages in one session could be another way?
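As a rough illustration of that per-session count idea, here is a minimal sketch in Python; the 40-page threshold and the 30-minute session window are assumptions picked from the numbers in this thread, not established values.

    import time
    from collections import defaultdict

    SESSION_WINDOW = 30 * 60   # assumed session length: 30 minutes
    MAX_PAGES = 40             # assumed threshold from the discussion above

    # per-IP list of recent request timestamps
    requests_by_ip = defaultdict(list)

    def looks_like_bot(ip, now=None):
        """Return True once an IP exceeds the page threshold in one session."""
        now = now or time.time()
        hits = requests_by_ip[ip]
        hits.append(now)
        # keep only hits inside the current session window
        hits[:] = [t for t in hits if now - t <= SESSION_WINDOW]
        return len(hits) > MAX_PAGES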
So far the ways of recognizing non-human visits are (see the sketch after this list):
1) If the IP requests robots.txt
2) If the UA contains certain keywords
3) The speed of requests
4) The quantity of requests in a single session
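A minimal sketch of how those four signals might be combined into a single score; the keyword list, weights, thresholds, and cutoff are all assumptions for illustration, not values from this thread.

    BOT_UA_KEYWORDS = ("bot", "crawler", "spider", "wget", "curl")  # assumed keywords

    def bot_score(fetched_robots_txt, user_agent, requests_per_minute, pages_in_session):
        """Score a visitor on the four signals above; higher means more bot-like."""
        score = 0
        if fetched_robots_txt:                       # signal 1
            score += 2
        ua = user_agent.lower()
        if any(k in ua for k in BOT_UA_KEYWORDS):    # signal 2
            score += 3
        if requests_per_minute > 20:                 # signal 3, assumed threshold
            score += 1
        if pages_in_session > 40:                    # signal 4, assumed threshold
            score += 1
        return score

    # Example: treat a score of 3 or more as a probable bot (assumed cutoff).
    is_probable_bot = bot_score(True, "ExampleCrawler/1.0", 5, 12) >= 3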
However, this program seems to use exactly the techniques mentioned above. It identifies spider visits first from a list of known spider user agents, but then applies some heuristics that try to detect previously unknown spiders. A dead giveaway is a request for robots.txt (even though I sometimes request it myself). Not accepting cookies also seems to be typical bot behaviour.
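The cookie observation can be tested by setting a cookie on the first response and checking whether it comes back on later requests. Here is a minimal sketch using Python's standard-library http.server, purely for illustration; the cookie name "probe" and the port are assumptions.

    from http.server import BaseHTTPRequestHandler, HTTPServer
    from http.cookies import SimpleCookie

    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            cookies = SimpleCookie(self.headers.get("Cookie", ""))
            # A client that never returns the probe cookie behaves like a bot.
            returned_cookie = "probe" in cookies
            self.send_response(200)
            if not returned_cookie:
                self.send_header("Set-Cookie", "probe=1; Path=/")
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            msg = "cookie returned\n" if returned_cookie else "cookie set\n"
            self.wfile.write(msg.encode())

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), Handler).serve_forever()

A real deployment would log which client IPs keep hitting pages without ever returning the cookie, rather than answering in the response body.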