---- Yahoo! Slurp Now Supports Wildcards in robots.txt
bouncybunny - 12:57 pm on Nov 7, 2006 (gmt 0)
No. For larger sites with mixed dynamic and static content, user/member login areas, subscription-only content, etc., keeping the bots out of certain areas is needed, and being able to wildcard-match partial strings will go a long way towards cleaning dynamic URLs out of the SERPs (on Yahoo! at least, if they are the only ones to adopt these robots.txt operators).
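For illustration, a quick sketch of the two new operators; the /members/ path, the sessionid= parameter, and the .cgi extension here are made-up placeholders, not anything specific:

# Rules for Yahoo!'s crawler; * matches any run of characters
User-agent: Slurp
# Block any URL carrying a session id in the query string
Disallow: /*sessionid=
# Block the member-only area outright (no wildcard needed)
Disallow: /members/
# Block every .cgi URL; $ anchors the match to the end of the URL
Disallow: /*.cgi$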
I think you misunderstood what I was saying, but that's still an interesting post.
My question was aimed more at what the differences are between the rules for the three main robots. Wildcards are indeed useful. What I was trying to ask was whether it would be necessary to specify different rules for each bot, or whether simply using wildcards in one set of rules would cover all bases.
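To make the question concrete (same made-up sessionid= parameter as above), the two options would be:

# Option 1: one catch-all group; the wildcard will only be
# honored by crawlers that actually support it
User-agent: *
Disallow: /*sessionid=

# Option 2: a separate group per crawler
User-agent: Slurp
Disallow: /*sessionid=

User-agent: Googlebot
Disallow: /*sessionid=

Since a robot obeys only the most specific User-agent group that matches it, the per-crawler route looks like the safe one as long as some bots don't understand the wildcard syntax.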