
Forum Moderators: phranque


Preventing data mining whilst remaining Googlebot friendly



9:34 am on Aug 15, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

Is this the impossible dream? I am nearly ready to launch a site that I know will be like honey to the swarm of data miners out there. There will be pages full of lists of data, mostly geo/geocode data relating to towns, ZIP codes, schools, counties, etc.

Is there anything I can do to make the site data miner UNfriendly while still keeping the doors open for Googlebot, MSNbot and the like? I know it's a nigh-on impossible feat, but I was thinking along the lines of spider throttling, which the major bots seem to go along with. Even if I throttle back to one page every 10 seconds, I suppose the patient data miners will still get the goodies.
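For what it's worth, the throttling idea could be sketched as a simple per-IP rate limiter like the one below. This is just an illustrative Python sketch, not a drop-in solution: the `MIN_INTERVAL` value, `allow_request` name, and in-memory dict are all assumptions, and in practice you'd exempt verified search-engine IPs and expire old entries.

```python
import time
from collections import defaultdict

# Hypothetical throttle: allow at most one page every MIN_INTERVAL
# seconds per client IP. Verified crawler IPs (checked via reverse
# DNS, not just the User-Agent header) should be exempted in practice.
MIN_INTERVAL = 10  # seconds between pages per IP

last_seen = defaultdict(float)  # ip -> timestamp of last allowed hit

def allow_request(ip, now=None):
    """Return True if this IP may fetch a page right now."""
    now = time.time() if now is None else now
    if now - last_seen[ip] < MIN_INTERVAL:
        return False  # too soon since the last page
    last_seen[ip] = now
    return True
```

A state-keeping limiter like this also shows the weakness you mention: a patient scraper that requests one page every 10 seconds sails straight through it.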

A couple of other ideas I had:

1. Whitelist of user agents. This won't stop the hardcore guys but should be enough to filter out the script kiddies.

2. A few trap pages blocked in robots.txt and strewn around the site to entice the data miners, presuming I don't end up banning Googlebot itself, which I've heard can happen.
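The two ideas above could be combined into a rough gatekeeper along these lines. Purely a sketch: the UA substrings, trap path, and function names are hypothetical, and real deployments verify bots by reverse DNS rather than trusting the User-Agent string, which scrapers routinely fake.

```python
# Hypothetical gatekeeper combining a UA allow-list with a trap page.
# The trap path must also be Disallowed in robots.txt, so well-behaved
# bots never request it; anything that does fetch it gets banned.
KNOWN_BOTS = ("Googlebot", "msnbot", "Slurp")  # illustrative, not exhaustive
TRAP_PATH = "/private/listings.html"           # hypothetical trap URL

banned_ips = set()

def classify(ip, user_agent, path):
    """Label a request 'banned', 'bot', or 'visitor'."""
    if ip in banned_ips:
        return "banned"
    if path == TRAP_PATH:
        banned_ips.add(ip)  # robots.txt-obeying bots never land here
        return "banned"
    if any(bot in user_agent for bot in KNOWN_BOTS):
        return "bot"
    return "visitor"
```

This is exactly where the Googlebot-banning risk comes in: if the trap URL ever leaks into a sitemap or an internal link that isn't covered by the robots.txt Disallow rule, a legitimate crawler can fetch it and ban itself.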

Any other ways to block the database eaters?

