Forum Moderators: goodroi
I wonder if I could run the following by you please - just to check that we have got this 100% right...
We have a site which ranks really well across all engines, but we want to exclude some duplicate content contained in specific directories that we are using as landing pages for several online campaigns.
Am I right in thinking that the following will exclude JUST the directories named in the root, but will not interfere with any other directories which contain valid, currently indexed content?
User-agent: *
Disallow: /campaign1/
Disallow: /campaign2/
Disallow: /campaign3/
Therefore, all content in these directories will not be spidered, while all other directories, and any .htm/.asp/.shtml pages actually contained in the root, will be spidered as normal?
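In case it helps, you can sanity-check rules like these offline with Python's standard-library robots.txt parser before deploying them. This is just a quick sketch; the page paths below are made-up examples, not your real URLs:

```python
from urllib.robotparser import RobotFileParser

# The proposed robots.txt rules, verbatim
rules = """\
User-agent: *
Disallow: /campaign1/
Disallow: /campaign2/
Disallow: /campaign3/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Anything under a disallowed directory is blocked
print(rp.can_fetch("*", "/campaign1/landing.htm"))   # False

# Root-level pages and other directories remain crawlable
print(rp.can_fetch("*", "/index.htm"))               # True
print(rp.can_fetch("*", "/products/page.asp"))       # True
```

A well-behaved spider should refuse only the three named directories and leave everything else alone, which is what the parser confirms here.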
Thanks very much.