Robert_Charlton - 8:59 pm on Jun 18, 2012 (gmt 0)
Most SEO experts make this mistake of adding Allow in robots.txt
I don't know about most experts elsewhere, but site search on WebmasterWorld suggests that "expert" opinion here cautions against using "allow" unless you're very careful about it.
Here are two threads, one old, one recent, with some comments on the topic that are worth reading....
Using Allow: / in robots.txt
Google says to use it?
May 2003
Google supports several kinds of extensions to the Standard for Robots Exclusion. Some of them may be life-savers, making a daunting job trivial in some cases. For example, their support of wildcard filename-matching, in addition to simple (standard) prefix-matching, might come in very handy.
However, I would never use any of these extensions except in an exclusive User-agent: Googlebot record.
There is simply no telling what any other robot might do with those Google-specific extensions!
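The advice quoted above can be sketched as a robots.txt file (the paths here are hypothetical, purely for illustration): the Google-specific extensions live only inside a `User-agent: Googlebot` record, while every other robot sees nothing but standard prefix-matched Disallow lines. Note that in Google's implementation a Googlebot-specific record replaces the `*` record for Googlebot rather than adding to it, so any shared rules must be repeated inside it:

```
# Standard record for all other robots: plain prefix-matching Disallow only
User-agent: *
Disallow: /private/

# Googlebot-only record: Google's extensions are confined here
# (Googlebot uses this record INSTEAD of the * record, so repeat shared rules)
User-agent: Googlebot
Disallow: /private/
Disallow: /*.pdf$
Allow: /private/public-report.html
```

An unknown robot reading this file only ever encounters standard syntax; whatever it makes of the Googlebot record, it has no reason to apply it to itself.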
Lost all rankings from Google - due to robots.txt
As this is the "Robots Exclusion Protocol" everything hinges on this being a disallow list.
...Even though both Bing and Google say they now support a few extensions to the standard syntax, the actual current standard is explained here: [robotstxt.org...]
...and here is Google's Help page: [support.google.com...] If you start blocking some URLs or URL patterns, the details Google provides can become important for getting the exact results that you intended.
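To illustrate the "everything hinges on this being a disallow list" point: under the base standard documented at robotstxt.org, a record is just a list of Disallow prefixes, and any URL not matched by one is implicitly crawlable, so no `Allow` line is ever required. A minimal standard-compliant file (hypothetical paths) looks like this:

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
# Everything else is crawlable by default; no Allow directive needed
```

If you find yourself reaching for `Allow`, it usually means you are carving exceptions out of a broad block, which is exactly the situation where the per-crawler details in Google's documentation start to matter.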