Forum Moderators: open
Server Farms - August 2014
I've got heaps of ranges flagged as "robot" based on information in the present thread and its siblings. But they can remain unblocked for years if all they're ever used for is hosting innocuous websites, not running bots.
In htaccess, every "Deny from..." line has to be evaluated by the server on every request... Unused rules just make unnecessary work for the server.
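One way to keep that per-request cost down is to consolidate adjacent addresses into a few CIDR ranges rather than listing individual IPs. A minimal sketch in Apache 2.2-era syntax (current when this thread was written), using the documentation-reserved TEST-NET ranges as placeholder addresses:

```apache
# Hypothetical example: two CIDR lines cover what might otherwise
# be hundreds of single-IP "Deny from" entries. The addresses below
# are RFC 5737 documentation ranges, not real server-farm ranges.
Order Allow,Deny
Allow from all
Deny from 203.0.113.0/24
Deny from 198.51.100.0/24
```

Fewer lines means fewer comparisons per request, though the broader point in the thread still holds: a range that never sends a request costs nothing to leave unblocked.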
...there's no point in blocking a range that will never actually visit.
How do you distinguish between a human-free farm and a provider that just happens to have a human customer with a virus-infected computer?