> keep changing names to fool your robots.txt file
Bad bots never even look at robots.txt, and even if they did, they would ignore its directives. Only good bots (sometimes) obey the directions "suggested" by robots.txt. Bad bots do their own thing, and the only way to block them is to detect telltale access parameters and return a 403 with no content. This requires eternal vigilance to catch new methods, but a good detector can foil 99% of scrapers.
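As a minimal sketch of that approach, here is illustrative WSGI middleware that checks a few request parameters (missing User-Agent, known scraper UA substrings, absent Accept-Language) and returns a 403 with an empty body. The specific patterns and heuristics are hypothetical examples, not a vetted blocklist; a real detector would use far more signals.

```python
import re

# Hypothetical scraper signatures, for illustration only.
BAD_UA_PATTERNS = [
    re.compile(p, re.IGNORECASE)
    for p in (r"python-requests", r"scrapy", r"\bcurl\b", r"\bbot\b")
]

def looks_like_bad_bot(user_agent: str, headers: dict) -> bool:
    """Heuristic check on a few access parameters."""
    if not user_agent:  # many scrapers send no User-Agent at all
        return True
    if any(p.search(user_agent) for p in BAD_UA_PATTERNS):
        return True
    # Real browsers send Accept-Language; many scrapers omit it.
    if "Accept-Language" not in headers:
        return True
    return False

def block_bad_bots(app):
    """WSGI middleware: 403 with no content for detected scrapers."""
    def wrapped(environ, start_response):
        ua = environ.get("HTTP_USER_AGENT", "")
        headers = {}
        if "HTTP_ACCEPT_LANGUAGE" in environ:
            headers["Accept-Language"] = environ["HTTP_ACCEPT_LANGUAGE"]
        if looks_like_bad_bot(ua, headers):
            start_response("403 Forbidden", [("Content-Length", "0")])
            return [b""]  # 403 with no content
        return app(environ, start_response)
    return wrapped
```

The "eternal vigilance" part is keeping `BAD_UA_PATTERNS` and the header heuristics current as scrapers adapt.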