Forum Moderators: goodroi
11. How do I block all crawlers except Googlebot from my site?
The following robots.txt file will achieve this for all well-behaved crawlers.
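The file itself did not survive here; a sketch of the kind of record meant, assuming the non-standard `Allow` extension that Googlebot recognizes (the very directive the next reply objects to):

```
User-agent: Googlebot
Allow: /

User-agent: *
Disallow: /
```

The first record lets Googlebot crawl everything; the second tells every other well-behaved crawler to stay out entirely.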
Invalid field name. There is no `Allow` directive in the robots.txt standard.
There has been discussion of adding `Allow` among those who maintain the robots.txt protocol for quite some time. As far as I know, it has not been adopted into the standard yet.
The default behavior of robots.txt is to allow everything unless you have a `Disallow` rule covering that resource.
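Because of that default-allow behavior, a standard-compliant record only ever needs `Disallow`. A minimal sketch (the `/private/` path is just an illustration):

```
User-agent: *
Disallow: /private/
```

Everything outside `/private/` remains crawlable without any `Allow` directive, and an empty `Disallow:` line means nothing at all is blocked for that record.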
P.S. Hehehe, I've only had to use that directive once! ;)
When the `Allow` directive becomes part of a valid robots.txt file, I'll consider reformatting. Until the authoritative resource on the robots.txt protocol states that `Allow` is supported, I think it is best to follow the current standard.
However, I would never use any of these extensions except in an exclusive User-agent: Googlebot record.
There is simply no telling what any other robot might do with those Google-specific extensions!
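A sketch of that approach, keeping the extension inside a record that only Googlebot will match (the `/articles/` path is a hypothetical example):

```
User-agent: Googlebot
Allow: /articles/
Disallow: /

User-agent: *
Disallow: /
```

Googlebot, which understands `Allow`, crawls only `/articles/`; every other crawler matches the `*` record, which uses nothing beyond the standard `Disallow`, so an unknown robot never has to interpret the Google-specific directive.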