Forum Moderators: Robert Charlton & goodroi
There's no "Allow" directive in the original robots.txt standard, so there's no need to add it to the file.
Many SEO experts make this mistake of adding Allow lines to robots.txt.
Google supports several kinds of extensions to the Standard for Robots Exclusion. Some of them may be life-savers, making a daunting job trivial in certain circumstances. For example, their support of wildcard filename-matching, in addition to simple (standard) prefix-matching, can come in very handy.
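As a sketch of the difference (the paths here are made up for illustration): a standard rule is a plain prefix match, while Google's extensions also understand "*" and "$":

```
User-agent: Googlebot
# Google extension: "*" matches any run of characters, "$" anchors the end,
# so this blocks any URL whose path ends in .pdf
Disallow: /*.pdf$
# Standard prefix match: blocks /private, /private/, /private-page.html, etc.
Disallow: /private
```

A robot that only knows the original standard would treat `/*.pdf$` as a literal path prefix, which matches nothing useful.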
However, I would never use any of these extensions except in an exclusive User-agent: Googlebot record.
There is simply no telling what any other robot might do with those Google-specific extensions!
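One way to follow that advice (paths again hypothetical) is to keep plain standard syntax in the general record and confine the extensions to a Googlebot-only record:

```
# Plain standard syntax that every robot should understand
User-agent: *
Disallow: /private/

# Google-specific wildcard syntax, quarantined in its own record
User-agent: Googlebot
Disallow: /private/
Disallow: /*?sessionid=
```

Note the `Disallow: /private/` line is repeated: a robot obeys only the most specific matching User-agent record, so Googlebot ignores the `*` record entirely and needs its own copy of the standard rules.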
As this is the "Robots Exclusion Protocol," everything hinges on it being a disallow list.
...Even though both Bing and Google say they now support a few extensions to the standard syntax, the actual current standard is explained here: [robotstxt.org...]
...and here is Google's Help page: [support.google.com...] If you start blocking some URLs or URL patterns, the details Google provides can become important for getting the exact results that you intended.
And the fact that it is actually a robots *exclusion* protocol makes Google's "Allow:" extension fundamentally unsound.
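To see how much Allow handling can vary between parsers, here's a small sketch using Python's standard-library robotparser (the bot name and URLs are made up). This parser applies the *first* matching rule, so the order of Allow and Disallow lines matters, whereas Google documents most-specific-path precedence - exactly the kind of inconsistency being complained about here:

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
# parse() takes the file's lines, so rules can be tested without fetching anything
rp.parse("""
User-agent: *
Allow: /private/public-page.html
Disallow: /private/
""".splitlines())

# The Allow line comes first, so this parser lets the page through;
# if the Disallow line came first, the same URL would be blocked
print(rp.can_fetch("MyBot", "https://example.com/private/public-page.html"))  # True
# Everything else under /private/ is still blocked
print(rp.can_fetch("MyBot", "https://example.com/private/secret.html"))       # False
```

Swap the two rule lines and the "public" page becomes blocked for this parser, while Google's longest-match rule would still allow it.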