Frank_Rizzo - 4:04 pm on Feb 12, 2012 (gmt 0)
I did not like whitelisting. I assumed it was the wrong approach. Why would you want to keep bots out? What happens when a new bot appears and you don't know about it? You may be losing customers.
Then one day you realise that frisking, researching, and wondering whether to allow or block every new bot that comes along is totally pointless. The vast majority of "bots" are not legitimate crawlers at all. They are scrapers, scammers, leechers.
Seriously. Run a smart robots PHP script in place of your robots.txt. Let the real file be read by your favourite crawlers. Stuff the rest of them.
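The idea is simple: serve a permissive robots.txt to the handful of crawlers you actually want, and a deny-all file to everyone else. The original suggests PHP; here is a minimal sketch of the same logic in Python for illustration. The whitelist entries, function name, and the exact rules are my own assumptions, not anything from the post.

```python
# Illustrative sketch of a "smart robots.txt": whitelist the crawlers you
# trust, hand everyone else a deny-all file. The bot names below are just
# examples -- adjust the whitelist to taste.

ALLOWED_CRAWLERS = ("googlebot", "bingbot", "duckduckbot")  # hypothetical whitelist

PERMISSIVE = "User-agent: *\nDisallow:\n"     # empty Disallow = full access
RESTRICTIVE = "User-agent: *\nDisallow: /\n"  # deny-all for everyone else

def robots_txt(user_agent: str) -> str:
    """Return robots.txt content based on the requesting user agent."""
    ua = user_agent.lower()
    if any(bot in ua for bot in ALLOWED_CRAWLERS):
        return PERMISSIVE
    return RESTRICTIVE

print(robots_txt("Mozilla/5.0 (compatible; Googlebot/2.1)"))
```

In practice you would point requests for /robots.txt at the script with a server rewrite rule, so trusted crawlers never know the difference. Note that a scraper can trivially spoof its user agent, so this is a first filter, not a security measure.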