Forum Moderators: goodroi


Quick 'Sanity Check'


markd

10:58 pm on Jan 10, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Dear all

I wonder if I could run the following by you please - just to check that we have got this 100% right...

We have a site which ranks really well across all engines, but we want to exclude some 'duplicate content' contained in specific directories which we are using as 'landing pages' for several online campaigns.

Am I right in thinking that the following will exclude JUST the directories named in the root, but will not interfere with any other directories which contain valid, currently indexed content?

User-agent: *
Disallow: /campaign1/
Disallow: /campaign2/
Disallow: /campaign3/

Therefore, all content in these directories will not be spidered, while all other directories, and the .htm/.asp/.shtml pages actually contained in the root, will be spidered as normal?
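(An aside, not from the original thread: rules like these can be sanity-checked before deployment with Python's standard-library urllib.robotparser, which simulates how a compliant spider interprets the file. The paths below are hypothetical examples.)

```python
import urllib.robotparser

# robots.txt as proposed in the post (trailing slash on each directory)
rules = """\
User-agent: *
Disallow: /campaign1/
Disallow: /campaign2/
Disallow: /campaign3/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Pages inside the listed directories are blocked for all user-agents...
print(rp.can_fetch("*", "/campaign1/landing.htm"))  # False
# ...while everything else under the root remains crawlable.
print(rp.can_fetch("*", "/index.htm"))              # True
print(rp.can_fetch("*", "/products/widget.asp"))    # True
```

So yes: matching is by path prefix, and the three rules touch only URLs that begin with those exact directory paths.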

Thanks very much.

Lord Majestic

2:35 am on Jan 11, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It's almost perfect - just remove the / at the end to cater for all possibilities (robots.txt matches by prefix, so without the slash the rule also catches the bare URL /campaign1 and anything else starting with that path), i.e.:

User-agent: *
Disallow: /campaign1
Disallow: /campaign2
Disallow: /campaign3
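(Again as a modern aside, not from the thread: urllib.robotparser shows exactly what dropping the trailing slash changes. Note the trade-off in the last check - the file name /campaign1.html here is hypothetical.)

```python
import urllib.robotparser

# Revised robots.txt with the trailing slashes removed
rules = """\
User-agent: *
Disallow: /campaign1
Disallow: /campaign2
Disallow: /campaign3
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The bare directory URL is now blocked too (with the slash it was not)...
print(rp.can_fetch("*", "/campaign1"))           # False
print(rp.can_fetch("*", "/campaign1/page.htm"))  # False
# ...and so is ANY URL sharing the prefix, including sibling files.
print(rp.can_fetch("*", "/campaign1.html"))      # False
# Unrelated root content remains crawlable.
print(rp.can_fetch("*", "/contact.htm"))         # True
```

The one thing to watch: because the slash-less form is a pure prefix match, it would also block a legitimate root file such as /campaign1.html if one existed - if you have any such pages, keep the trailing slash instead.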

markd

10:20 am on Jan 11, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Thanks, Your Lordship - will do, and thank you for giving it the once-over.