...what if you're intimidated by the idea of communicating directly with Googlebot? After all, not all of us are fluent in the language of robots.txt. This is why we're pleased to introduce you to your personal robot translator: the Robots.txt Generator in Webmaster Tools. It's designed to give you an easy and interactive way to build a robots.txt file. It can be as simple as entering the files and directories you don't want crawled by any robots.
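For instance, a generated file that keeps all robots out of a couple of directories might look like this (the directory names are just placeholders):

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
```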
I think webmasters who feel the need to exclude robots from certain parts of their own website (for bandwidth or other reasons) would be well advised to learn proper robots.txt syntax themselves, and I assume most of them have done so.
So perhaps the only reason this tool makes sense is Google's own bandwidth: if webmasters help steer AdsBot to only the relevant parts of a site, there's no need for those bots to crawl the whole thing. But do they anyway? Does AdsBot follow links and crawl pages other than the target URLs defined in the campaigns? Has anyone seen the Images bot crawl the whole WebmasterWorld site in search of thousands of copies of this beautiful world-map or visa logo? ;)
there is no official "Allow" statement
You are correct, Allow rules are an extension to the original robots.txt protocol, as are pattern matching wild cards [google.com] and indicating your Sitemap location [google.com]. The article mentions this:
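For reference, here is roughly what those extensions look like together in one file (the paths and domain are placeholders); per Google's documentation, `*` matches any sequence of characters and `$` anchors the end of the URL:

```
User-agent: Googlebot
Allow: /public/
Disallow: /*.pdf$

Sitemap: http://www.example.com/sitemap.xml
```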
The Robots.txt Generator creates files that Googlebot will understand, and most other major robots will understand them too. But it's possible that some robots won't understand all of the robots.txt features that the generator uses.
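That caveat is easy to demonstrate. Python's standard-library robots.txt parser, for example, understands Allow lines, but it applies rules in file order (first match wins) rather than Googlebot's most-specific-match rule, so the same file can behave differently across crawlers. A small sketch, with made-up paths:

```python
from urllib import robotparser

# A file using the Allow extension. Note the Allow line comes first:
# Python's parser stops at the first matching rule, so ordering matters
# here, whereas Googlebot picks the most specific matching rule instead.
ROBOTS_TXT = """\
User-agent: *
Allow: /private/public-report.html
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

print(rp.can_fetch("*", "/private/public-report.html"))  # True  (Allow matched)
print(rp.can_fetch("*", "/private/secret.html"))         # False (Disallow matched)
print(rp.can_fetch("*", "/index.html"))                  # True  (no rule matched)
```

Swap the Allow and Disallow lines and Python's parser would block the report page too, even though Googlebot would still fetch it.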
I'm thinking that the robots.txt generator in Webmaster Tools might also help people avoid situations like this one:
Why Google Might "Ignore" a robots.txt Disallow Rule [webmasterworld.com]
I have a client who publishes ads - not AdSense - they have proprietary ad-serving technology and a number of local advertisers as customers. The issue is that bots are indexing the pages of the site and then clicking on the ads, adding fake cost for the advertisers, who aren't happy about it.
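If the ad clicks go through a redirect script at a predictable path, one first line of defense is simply disallowing that path for every user-agent (the /adclick/ path here is hypothetical):

```
User-agent: *
Disallow: /adclick/
```

Of course, this only keeps out robots that honor robots.txt; deliberately abusive click bots will ignore it, so server-side filtering of the click logs is still needed.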