Forum Moderators: phranque
Is there anything I can do to make the site data-miner UNfriendly while still keeping the doors open for Googlebot, MSNbot and the like? I know it's a nigh-on impossible feat, but I was thinking along the lines of spider throttling, which the major bots seem to go along with. Even if I throttle back to one page every 10 seconds, I suppose the patient data miners will still get the goodies.
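For the throttling idea, something like this sliding-window check is what I had in mind (just a sketch; the window and limit values, and the in-memory store, are made up for illustration, not from any particular framework):

```python
# Per-IP throttle: allow at most MAX_REQUESTS per THROTTLE_WINDOW seconds.
# Illustrative values only; a real site would persist this outside one process.
import time
from collections import defaultdict, deque

THROTTLE_WINDOW = 10.0   # seconds
MAX_REQUESTS = 1         # i.e. one page every 10 seconds

_hits = defaultdict(deque)  # ip -> timestamps of recent requests

def allow_request(ip, now=None):
    """Return True if this IP is still under the rate limit."""
    now = time.time() if now is None else now
    q = _hits[ip]
    # Drop timestamps that have fallen outside the window.
    while q and now - q[0] > THROTTLE_WINDOW:
        q.popleft()
    if len(q) >= MAX_REQUESTS:
        return False
    q.append(now)
    return True
```

A request that fails the check would get a 503 (ideally with a Retry-After header) rather than the page.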
A couple of other ideas I had:
1. Whitelist of user agents. This won't stop the hardcore guys but should be enough to filter out the script kiddies.
2. A few trap pages blocked in robots.txt and strewn around the place to entice the data miners, presuming I don't end up getting Googlebot banned, which I've heard can happen.
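Idea 1 could be as simple as a substring check on the User-Agent header (a rough sketch; the token list here is just a guess at the big crawlers, and since UA strings are trivially faked, real verification of Googlebot etc. should also use reverse DNS):

```python
# Crude UA whitelist: enough to filter script kiddies, not the hardcore guys.
# Token list is illustrative; UA strings can be spoofed, so pair this with
# reverse-DNS checks for the bots you actually trust.
WHITELIST_TOKENS = ("googlebot", "msnbot", "bingbot", "slurp")

def is_whitelisted_bot(user_agent):
    ua = (user_agent or "").lower()
    return any(token in ua for token in WHITELIST_TOKENS)
```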
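And for idea 2, the honeypot logic might look like this: the trap path is disallowed in robots.txt, so well-behaved bots never fetch it, and anything that does gets its IP banned. The path name and the in-memory ban set are made up for illustration:

```python
# robots.txt would contain:
#   User-agent: *
#   Disallow: /trap/
#
# Honeypot sketch: fetching anything under /trap/ gets the client's IP banned.
TRAP_PREFIX = "/trap/"
banned_ips = set()

def should_block(path, ip):
    """Return True if this request should get a 403 instead of the page."""
    if path.startswith(TRAP_PREFIX):
        banned_ips.add(ip)   # walked into the trap: ban from now on
        return True
    return ip in banned_ips  # previously trapped IPs stay blocked
```

The worry about banning Googlebot is real, which is why the Disallow line matters: a compliant crawler reads robots.txt first and never touches the trap.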
Any other ways to block the database eaters?