---- Yahoo! Slurp Now Supports Wildcards in robots.txt
bouncybunny - 4:44 am on Nov 10, 2006 (gmt 0)
From your first post, it appears you want to just allow all 'bots to crawl and index everything on your site:
I wasn't trying to give an example of a robots file. I'm just wondering, now that they all support wildcards, whether it's still necessary to specify a separate set of rules for each of the three main bots, or whether a single User-agent: * record followed by the allow/disallow rules would affect msnbot, googlebot, and slurp in identical ways. If not, I'm interested in the differences.
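For illustration, here is a minimal sketch of the kind of single-record file the question is about, with hypothetical paths, assuming all three crawlers honor a shared User-agent: * block with wildcard patterns:

```
# One record applied to every compliant crawler
User-agent: *
# Block any URL containing /private/ anywhere in the path
Disallow: /*/private/
# Block URLs ending in .pdf ($ anchors the end of the URL;
# worth verifying per-bot, as $ support was newer than * support)
Disallow: /*.pdf$
```

One caveat worth checking before relying on a combined record: wildcards and the Allow directive are extensions to the original robots.txt convention, so when Allow and Disallow rules overlap, the precedence each crawler applies may differ.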