wildcards in the Disallow field, non standard? pmac msg:1528002 6:01 pm on Jun 10, 2004 (gmt 0) I have a site that is built in CF (ColdFusion) and I rebuild the pages as static HTML. So, on the server, we have two sets of identical pages: the dynamic ColdFusion URL for a page, and the static HTML copy of that same page.
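(For illustration only: the ColdFusion version might be something like www.example.com/widgets.cfm and the static copy www.example.com/widgets.html. These are hypothetical paths, not pmac's actual URLs.)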
As a result I blocked the cold fusion pages in robots.txt with a wildcard pattern.
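Such a block would typically look something like the following (an illustration, not pmac's actual file); the /*.cfm wildcard is the part the validator flags as non-standard:

    User-agent: *
    Disallow: /*.cfm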
Anyhow, Y! has gone and crawled these pages, and I have a pretty good hunch that I have set off a dupe penalty of some sort, as the site is absolutely buried in the SERPs.
Brett's validator shows that wildcards in the Disallow field are non-standard. If that is the case, how can I block the CF pages easily?
bakedjake msg:1528003 3:33 pm on Jun 11, 2004 (gmt 0)
You can't do it reliably through robots.txt.
Use a 403.
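One way to act on that advice (assuming the site runs on Apache with mod_setenvif and mod_access, none of which is stated in the thread) is to return 403 Forbidden to the major crawlers for any .cfm request, so only the static .html copies get fetched and indexed:

    <FilesMatch "\.cfm$">
        # Flag requests from the major crawlers (Slurp is Yahoo!'s bot)
        SetEnvIfNoCase User-Agent "(Googlebot|Slurp|msnbot)" is_bot
        # With Order Allow,Deny a matching Deny wins, so flagged
        # requests get a 403 while normal visitors are unaffected
        Order Allow,Deny
        Allow from all
        Deny from env=is_bot
    </FilesMatch>

This is only a sketch; mod_rewrite can do the same job.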
tschild msg:1528004 4:19 pm on Jun 11, 2004 (gmt 0)
The wildcard is recognized by Google but not generally supported - it's not part of the robots.txt standard.
You could disallow the .cfm pages by disallowing a left-justified substring (i.e. a path prefix) that matches the pages you want to block but no other pages. For example, if there is no other page or directory beginning with /p, then a rule like the one sketched below will do the trick.
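A minimal robots.txt along those lines, assuming (purely for illustration) that the ColdFusion templates are the only URLs whose paths start with /p:

    User-agent: *
    Disallow: /p

Note that Disallow matches on a prefix, so this blocks /p itself and anything else beginning with /p; that is exactly why the prefix has to be chosen so it matches no wanted pages.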