MJBill - 2:06 pm on Jan 2, 2014 (gmt 0)
Thank you Lucy24 and not2easy. This might be a little off-thread, but I had the following idea for blocking rogue bots and brute-force attackers. Being a newcomer, though, and never having seen it posted by the experts, I figured there is probably something obvious that I've missed.

My thought was to set a 1-second crawl delay in robots.txt (and in Webmaster Tools etc.), then permanently block anything that hits the site faster than that for more than a few seconds.

If it's only the crawl-delay part of the idea that's flawed (good search rankings are vital to me), I could instead check requests against the genuine bots' published CIDR ranges, or even do Google's horrid double look-up (reverse DNS, then a confirming forward look-up), but both of those seem a bit messy. Or is the whole concept a non-starter? Your advice would be much appreciated.
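[For reference, here is a minimal sketch of the rate-check I have in mind, not a finished implementation. All the names and thresholds (MAX_RATE, GRACE_SECONDS, should_block) are my own placeholders, and a real version would first whitelist verified search-engine IPs so a legitimate crawler ignoring the crawl-delay never gets banned.]

import time
from collections import defaultdict, deque

MAX_RATE = 1.0        # requests/second, matching a Crawl-delay of 1
GRACE_SECONDS = 5.0   # "more than a few seconds" of sustained abuse

hits = defaultdict(deque)   # ip -> timestamps of recent requests
banned = set()              # ips that have tripped the limit

def should_block(ip: str) -> bool:
    """Return True if this request should be rejected (IP is or becomes banned)."""
    if ip in banned:
        return True
    now = time.monotonic()
    window = hits[ip]
    window.append(now)
    # Drop timestamps older than the grace window.
    while window and now - window[0] > GRACE_SECONDS:
        window.popleft()
    # More hits in the window than the crawl-delay allows -> sustained
    # over-rate for the whole grace period, so ban permanently.
    if len(window) > MAX_RATE * GRACE_SECONDS + 1:
        banned.add(ip)
        del hits[ip]
        return True
    return False

[With those numbers, an IP hammering at 2 requests/second would exceed the window's allowance within about three seconds and be banned, while anything at or under 1 request/second is never touched.]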