


wildcards in the Disallow field, non standard?

pmac

6:01 pm on Jun 10, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I have a site built in CF (ColdFusion), and I rebuild the pages as static HTML. So, on the server, we have two sets of identical pages, with one URL looking like:

/products.cfm?CatID=38

And the other looking like:

/keyword.html

As a result, I blocked the ColdFusion pages in robots.txt like this:

Disallow: /*.cfm

Anyhow, Y! has gone and crawled these pages, and I have a pretty good hunch that I have set off a dupe penalty of some sort, as the site is absolutely buried in the SERPs.

Brett's validator shows that wildcards in the Disallow field are nonstandard. If that is the case, how can I block the CF pages easily?

bakedjake

3:33 pm on Jun 11, 2004 (gmt 0)

WebmasterWorld Administrator bakedjake is a WebmasterWorld Top Contributor of All Time 10+ Year Member



You can't do it reliably through robots.txt.

Use a 403.
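
If the site happens to be running on Apache with mod_rewrite, a rough sketch in .htaccess would be something along these lines (the bot names are only examples, substitute whichever crawlers you want to keep off the .cfm URLs):

RewriteEngine On
# example bot names - adjust to the crawlers you want to block
RewriteCond %{HTTP_USER_AGENT} (Slurp|Googlebot|msnbot) [NC]
# return 403 Forbidden to those crawlers for any .cfm request
RewriteRule \.cfm$ - [F]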

tschild

4:19 pm on Jun 11, 2004 (gmt 0)

10+ Year Member



The wildcard is recognized by Google but not generally - it's not in the robots.txt standard.
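
For crawlers that do honor the Google extension, the scoped form would look like:

User-agent: Googlebot
Disallow: /*.cfm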

You could block the .cfm pages by disallowing a path prefix (a left-anchored substring) that matches the pages you want to block but no other pages. For example, if there is no other page or directory beginning with /p, then

Disallow: /p

will do the trick.
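
In your case, since Disallow matches on the leading part of the URL, you could also just list the template names themselves. Assuming products.cfm is your only CF template (add one line per template if there are more), this standard-compliant file covers all the query-string variants as well:

User-agent: *
Disallow: /products.cfm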