-- Search Engine Spider and User Agent Identification
---- Secure Sites From Botnet Vulnerability Probes
botslist - 2:41 pm on May 12, 2007 (gmt 0)
Besides, all the stupid RSS readers could fix their stinking scripts to identify themselves with a single line of Perl code, as in the sketch below.
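Assuming the script is built on LWP::UserAgent (whose default agent string is "libwww-perl/x.xx"), the one line that matters is the $ua->agent call; the reader name and URL here are just placeholders:

#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;

# Send a descriptive user agent instead of the toolkit default.
my $ua = LWP::UserAgent->new;
$ua->agent('MyFeedReader/1.0 (+http://example.com/feedreader)');

my $response = $ua->get('http://example.com/feed.xml');
print $response->status_line, "\n";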
But even if they change the UA string, the crackers will just change theirs too, don't you think?
Sloppy and lazy programming is never an excuse for allowing a default user agent from a toolkit access to your server. If the programmer isn't bright enough to change it, then whatever he/she wrote probably shouldn't be hitting the web in the first place, not my server anyway.
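For what it's worth, a quick-and-dirty way to bounce those default toolkit agents at the door, assuming a Perl CGI environment (the agent list is illustrative, not exhaustive):

#!/usr/bin/perl
use strict;
use warnings;

# Reject requests arriving with a default toolkit user agent.
my $agent = $ENV{HTTP_USER_AGENT} || '';

my @defaults = (
    qr/^libwww-perl/i,
    qr/^Java\//i,
    qr/^Python-urllib/i,
    qr/^PHP\//i,
    qr/^Wget/i,
);

if (grep { $agent =~ $_ } @defaults) {
    print "Status: 403 Forbidden\r\n";
    print "Content-Type: text/plain\r\n\r\n";
    print "Default toolkit user agents are not accepted here.\n";
    exit;
}

# ...normal page handling continues here...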
Not my server either (I write most of my software myself in C++, including my main web server). But the fact is that there are just too many badly written programs and scripts out there, and the best defense is to avoid them altogether in the first place rather than trying to reduce their damage by adding another layer of protection, like filtering the query strings.
Yet I agree with you that if a badly written script can't be avoided for some reason, and if the author won't or can't fix it for whatever reason, then it makes sense to do everything possible to reduce the potential damage.
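A crude query-string filter is one such layer. A rough sketch, again assuming a Perl CGI environment (the probe patterns are illustrative only, aimed at the remote-file-inclusion style probes the botnets send):

#!/usr/bin/perl
use strict;
use warnings;

# Refuse query strings that look like vulnerability probes.
my $qs = $ENV{QUERY_STRING} || '';

my @suspicious = (
    qr{=\s*https?://}i,   # e.g. ?page=http://evil.example/shell.txt
    qr{\.\./},            # directory traversal
    qr{\x00},             # null byte
);

if (grep { $qs =~ $_ } @suspicious) {
    print "Status: 403 Forbidden\r\n";
    print "Content-Type: text/plain\r\n\r\n";
    print "Request blocked.\n";
    exit;
}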