Forum Moderators: goodroi
And the other looking like:
As a result, I blocked the ColdFusion pages in robots.txt like this:
Anyhow, Y! has gone and crawled these pages, and I have a pretty good hunch that I have set off a duplicate-content penalty of some sort, as the site is absolutely buried in the SERPs.
Brett's validator shows that wildcards in the Disallow field are nonstandard. If that is the case, how can I block the CF pages easily?
You could block the .cfm pages by disallowing a left-justified substring (a path prefix) that matches the pages you want blocked but no other pages. For example, if there is no other page or directory beginning with /p, then

Disallow: /p

will do the trick.
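If you want to sanity-check a prefix rule before deploying it, Python's standard-library robots.txt parser applies the same left-justified prefix matching. This is just a sketch; the paths /products.cfm and /about.html are hypothetical examples standing in for your ColdFusion and regular pages.

```python
# Sketch: check that a prefix-based Disallow rule blocks the ColdFusion
# URLs without touching other pages. Paths below are hypothetical.
from urllib import robotparser

rp = robotparser.RobotFileParser()
# parse() accepts the robots.txt body as a list of lines
rp.parse([
    "User-agent: *",
    "Disallow: /p",  # left-justified prefix: blocks any path starting with /p
])

print(rp.can_fetch("*", "/products.cfm"))  # False: matches the /p prefix
print(rp.can_fetch("*", "/about.html"))    # True: unaffected
```

Note that this standard prefix matching is exactly why the rule only works if no legitimate page shares the prefix; otherwise you would need a longer, more specific prefix per path.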