Str82u - 6:57 pm on Mar 28, 2013 (gmt 0)
@g1smd does this only apply to Google? [developers.google.com...] - way down the page they have:
disallow - The disallow directive specifies paths that must not be accessed by the designated crawlers. When no path is specified, the directive is ignored.
allow - The allow directive specifies paths that may be accessed by the designated crawlers. When no path is specified, the directive is ignored.
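To illustrate what Google documents there, a more specific Allow rule can carve an exception out of a broader Disallow (paths here are made up for the example):

```
User-agent: Googlebot
Disallow: /private/
Allow: /private/public-page.html
```

With that in place, Googlebot would skip everything under /private/ except the one allowed page. Other crawlers may not honor Allow the same way, which is the point of the question.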
That said, [robotstxt.org...] does mention the lack of an Allow field:
To exclude all files except one - This is currently a bit awkward, as there is no "Allow" field. The easy way is to put all files to be disallowed into a separate directory, say "stuff", and leave the one file in the level above this directory:
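The robotstxt.org workaround, then, looks something like this (directory name is their example "stuff"; the user path is illustrative):

```
User-agent: *
Disallow: /~joe/stuff/
```

Everything moved into the "stuff" directory is blocked, and the single file left one level up stays crawlable, no Allow directive required.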
We use "Allow" and "Disallow" together with no apparent problems.