Forum Moderators: mack
I am trying to prevent search engine indexing only on certain pages on a website.
For example, I have www.mysite.com/disclaimer.html, /privacy-policy.html, etc. that I do not want indexed.
How do I accomplish this via robots.txt, or is there another way?
P.S. I believe I already understand how to prevent entire directories from being indexed with a robots.txt Disallow rule; it's blocking individual pages I'm unsure about.
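For what it's worth, blocking individual pages in robots.txt works the same way as blocking a directory: one Disallow line per path, relative to the site root. Using the example pages from the question:

```text
User-agent: *
Disallow: /disclaimer.html
Disallow: /privacy-policy.html
```

One caveat: Disallow only stops compliant crawlers from fetching the page; a blocked URL can still show up in search results (without a snippet) if other sites link to it, which is why the robots meta tag or an equivalent header is the more reliable way to keep a page out of the index.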
EDIT: the pages are not static HTML, so I don't think I can use the "noindex" robots meta tag on each page.
[edited by: Emilio at 12:20 am (utc) on Aug. 5, 2008]
Not sure why the pages being dynamic prevents you from doing this, unless your CMS or database gives you no way to specify the robots meta value per page. We just implemented a CMS at work, and I made sure that every leaf or branch item type representing a page has a field where I can set that value. It defaults to "index,follow", but I can select any other value from the dropdown.
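The per-page-field approach above can be sketched in a few lines. This is a minimal, generic illustration (the set name and function names are made up for the example, not from any particular CMS): each page path maps to a robots directive that defaults to "index,follow", and the template layer emits the matching meta tag in the page's head.

```python
# Hypothetical per-page robots settings; in a real CMS this would be a
# field stored alongside each page record, defaulting to "index,follow".
NOINDEX_PAGES = {"/disclaimer.html", "/privacy-policy.html"}

def robots_directive(path):
    """Return the robots value for a page, defaulting to index,follow."""
    return "noindex,follow" if path in NOINDEX_PAGES else "index,follow"

def robots_meta_tag(path):
    """Render the meta tag a dynamic page template would emit in <head>."""
    return '<meta name="robots" content="%s">' % robots_directive(path)

print(robots_meta_tag("/disclaimer.html"))
print(robots_meta_tag("/index.html"))
```

Because the tag is generated per request from stored data, it works the same whether the page is static or dynamic, which is the point being made above.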