Google's Googlebot page points to robotstxt.org for the de-facto 'robots exclusion protocol'. However, that site lists nothing regular-expression related, even though Googlebot properly understands asterisks in robots.txt:
The robots.txt 'protocol' does not support regular expressions, and neither does Google. Google supports 'wild-cards', denoted by asterisks, but not formal regular expressions.
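To illustrate the distinction, here is a minimal sketch of what Google's wild-card support actually covers. Only two pattern characters are special: '*' (any sequence of characters) and '$' (anchors the end of the URL). The paths below are hypothetical:

User-agent: Googlebot
# '*' matches any run of characters, '$' anchors the end of the URL
Disallow: /*.pdf$
# block any URL whose path contains a /private/ directory at any depth
Disallow: /*/private/

A true regular expression such as /.+\.pdf$ would not work here; Googlebot treats '.' and '+' as literal characters.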
Several search engines support various 'extensions' to the robots.txt protocol. Webmasters must take care to use these proprietary extensions only in robots.txt policy records that apply to the specific robots that support them.
The effects of using a wild-card URL-path in a policy record for a robot that doesn't understand wild-cards might range from 'no effect' to 'disastrous'.
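A minimal sketch of the safe layout (all paths hypothetical): keep the wild-card rules in a record addressed to a robot known to support them, and give the catch-all record literal path prefixes only.

# Wild-card rules only for a robot documented to support them
User-agent: Googlebot
Disallow: /*?sessionid=

# All other robots get literal path prefixes only
User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/

A robot that matches by literal prefix would read 'Disallow: /*?sessionid=' as a path beginning with the literal characters '/*?sessionid=', which matches nothing, so the URLs you meant to block would still be crawled; a stricter parser might discard the record entirely.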
I know. However, _Google_ itself calls these 'regular expressions': "Using regular expressions in your robots.txt file can allow you to easily block large numbers of URLs." (see bottom of page) [google.com...]
'Protocol' is simply the de-facto name people use; there is no associated RFC: [en.wikipedia.org...]
robotstxt.org was born as a supporting website for the (now closed) email@example.com mailing list; its database and information are extremely outdated.