wilderness - 11:57 pm on Apr 2, 2013 (gmt 0)
Some people advise trying to stop bad robots by testing User Agent strings.
I certainly hope you're referring to another forum?
I'm not aware of any participant here who focuses solely on the UA; rather, most everybody uses multiple methods and/or conditions to determine their own priorities.
Others white-list: deny everything by default, then allow acceptable IPs, UAs, and/or various combinations of both.
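That deny-by-default approach can be sketched roughly as below. The networks and UA substrings here are purely illustrative assumptions, not a recommended list; a real white-list would reflect your own priorities.

```python
import ipaddress

# Hypothetical white-list rules -- illustrative values only.
ALLOWED_NETWORKS = [ipaddress.ip_network("66.249.64.0/19")]   # e.g. a known crawler range
ALLOWED_UA_SUBSTRINGS = ["Googlebot", "bingbot"]

def is_allowed(ip: str, user_agent: str) -> bool:
    """Deny by default: a request passes only if BOTH the IP and the UA match."""
    addr = ipaddress.ip_address(ip)
    ip_ok = any(addr in net for net in ALLOWED_NETWORKS)
    ua_ok = any(s in user_agent for s in ALLOWED_UA_SUBSTRINGS)
    return ip_ok and ua_ok
```

Combining both conditions is the point: either check alone is trivial to spoof, but an acceptable UA coming from the wrong network fails the test.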
Others show more diversity:
Same IP address, all within 90 seconds:
188.8.131.52 = DoCoMo/2.0 N905i(c100;TB;W24H16) (compatible; Googlebot-Mobile/2.1;+http://www.google.com/bot.html)
184.108.40.206 = SAMSUNG-SGH-E250/1.0 Profile/MIDP-2.0 Configuration/CLDC-1.1 UP.Browser/220.127.116.11.c.1.101 (GUI) MMP/2.0
18.104.22.168 = Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
The above belongs in one of the numerous FAKE GOOGLE threads.
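The usual way to unmask a fake Googlebot like the one above is the reverse-DNS check Google itself documents: reverse-resolve the IP, confirm the hostname ends in googlebot.com or google.com, then forward-resolve that hostname to make sure it maps back to the same IP. A minimal sketch (function names are my own):

```python
import socket

def hostname_is_google(hostname: str) -> bool:
    """Genuine Googlebot reverse-DNS names end in googlebot.com or google.com."""
    return hostname.rstrip(".").endswith((".googlebot.com", ".google.com"))

def verify_googlebot(ip: str) -> bool:
    """Reverse lookup, check the domain, then forward-confirm it maps back to the IP."""
    try:
        host, _, _ = socket.gethostbyaddr(ip)           # reverse DNS
        if not hostname_is_google(host):
            return False
        return ip in socket.gethostbyname_ex(host)[2]   # forward confirmation
    except OSError:
        return False
```

The forward confirmation matters because anyone controlling their own reverse DNS can point a PTR record at googlebot.com; only Google can make the forward lookup resolve back to the crawler's IP.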
Same IP address, six minutes apart:
22.214.171.124 = SEOstats 2.1.0 https://github.com/eyecatchup/SEOstats
126.96.36.199 = wscheck.com/1.0.0 (+http://wscheck.com/)
188.8.131.52 = bot.wsowner.com/1.0.0 (+http://wsowner.com/)
The above belongs in one of the server farm threads, although it appears that's what this thread has been reinvented as.