-- Search Engine Spider and User Agent Identification
---- Best method of blocking?
jmccormac - 5:53 am on Aug 31, 2013 (gmt 0)
Would the "allow" list be smaller?
Possibly. It is certainly worth considering.
And the long-range benefits of allowing the wider IP ranges are not that great unless your sales are in the millions.
It depends on the audience for the website as much as on the sales. Blocking entire countries might be acceptable for some sites, especially where there is no financial argument for allowing traffic from a particular country. A site with a localised, country-level market that only sells to that country could benefit from blocking every country outside its market. The important point is that there is no one-size-fits-all approach to blocking.
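As a rough illustration of the country-level allow-list idea, here is a minimal Python sketch using the standard library's `ipaddress` module. The CIDR ranges are documentation/test blocks (RFC 5737) standing in for a real country's allocations, which you would normally pull from a regional registry's delegation files; the names and ranges are assumptions for the example, not anyone's actual blocking setup.

```python
import ipaddress

# Hypothetical home-market allow list. These are RFC 5737 documentation
# ranges used as placeholders, not real country-level allocations.
ALLOWED_RANGES = [
    ipaddress.ip_network("192.0.2.0/24"),
    ipaddress.ip_network("198.51.100.0/24"),
]

def is_allowed(ip: str) -> bool:
    """Return True if the visitor address falls inside an allowed range."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in ALLOWED_RANGES)

print(is_allowed("192.0.2.44"))   # inside an allowed range -> True
print(is_allowed("203.0.113.9"))  # outside every range -> False
```

With an allow-list approach the default is to block, so a new or unexpected range needs no action; the maintenance cost shifts to keeping the home-market ranges current as allocations change.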
You'll find out over time that it's far easier and less time consuming to keep the blade sharp on your chainsaw than it is to keep your scalpel sharp.
If I were simply basing the approach on detecting problem ranges as they hit my sites, then the A approach might make sense. However, I don't use that approach. As part of the work I do on hoster statistics and domain name tracking, the IPs for about 3.6 million DNSes have to be checked (simple country-level resolution in most cases), and that produces a list of approximately 3.3 million distinct IP addresses each month. That's separate from the surveys of the website IPs of com/net/org/biz/info/mobi/asia/us/etc. The website IP survey is part of a full web-mapping project, and it produces a lot of IP data. There may be a vast difference between this relatively industrialised approach and the "block on detection" approach.
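The deduplication step in that kind of bulk survey can be sketched in a few lines of Python with the stdlib `ipaddress` module: reduce the raw resolved addresses to a distinct set, then collapse adjacent addresses into summary networks that can be matched against country-level allocation tables. The toy input list is an assumption standing in for the millions of addresses a real monthly run would read from the DNS survey output.

```python
import ipaddress

# Toy sample standing in for a month's worth of resolved IPs; real runs
# would stream millions of addresses from the DNS survey output.
observed = [
    "198.51.100.1", "198.51.100.1", "198.51.100.2",
    "203.0.113.7", "198.51.100.3",
]

# Deduplicate, then collapse consecutive addresses into summary networks
# so the result is compact enough to match against allocation tables.
distinct = sorted({ipaddress.ip_address(ip) for ip in observed})
collapsed = list(ipaddress.collapse_addresses(distinct))

print(len(distinct))  # 4 distinct addresses from 5 observations
print(collapsed)
```

The same collapse step is what makes the "industrialised" approach tractable: blocking or allowing a handful of summary networks is far cheaper than testing millions of individual addresses per request.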