The safest way would be to cloak robots.txt, delivering a disallow-all file to anything other than Googlebot.
The basic process is to use mod_rewrite to redirect requests for robots.txt to, say, robots.php. In that script, check the IP address and user-agent string to identify Googlebot, then print the appropriate robots.txt directives. You could even add any non-Googlebot IP that requests robots.txt to a banned list, so those clients can't spider the site at all.
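As a rough sketch of what that could look like, assuming Apache with mod_rewrite enabled, the .htaccess rule would be:

RewriteEngine On
RewriteRule ^robots\.txt$ /robots.php [L]

Then robots.php does the actual check. The filename and the exact directives below are just placeholders taken from the description above; for the IP check, one reliable method is the reverse-then-forward DNS lookup that Google itself documents for verifying Googlebot:

<?php
// robots.php -- serve a permissive robots.txt to verified Googlebot,
// and a disallow-all file to everything else.

header('Content-Type: text/plain');

$ip = $_SERVER['REMOTE_ADDR'];
$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

$isGooglebot = false;

if (stripos($ua, 'Googlebot') !== false) {
    // Reverse DNS must point into googlebot.com or google.com, and the
    // forward lookup of that hostname must resolve back to the same IP.
    $host = gethostbyaddr($ip);
    if ($host !== false
        && preg_match('/\.(googlebot|google)\.com$/i', $host)
        && gethostbyname($host) === $ip) {
        $isGooglebot = true;
    }
}

if ($isGooglebot) {
    echo "User-agent: *\n";
    echo "Disallow:\n";      // empty Disallow = everything permitted
} else {
    echo "User-agent: *\n";
    echo "Disallow: /\n";    // everyone else gets nothing
    // This is also where you could log $ip to build the banned list.
}
?>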
Rather complicated, but the only sure way I know of!