I do see the robots.txt notice in the listings, as shown below:
"A description for this result is not available because of this site's robots.txt – learn more"
<snip>
I'm not sure if it could be a mod_rewrite issue or something, but no matter what robots directive I use, Google still seems to be indexing the posts.
You can EITHER block crawling OR you can block indexing. You cannot do both. (Do not be unhappy. It took me at least a year to wrap my brain around this concept.)
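To see why: a robots.txt Disallow only stops the bot from *fetching* pages, so Google can still index a URL it learns about from external links, which is exactly what produces that "description is not available" snippet. A sketch of the kind of rule that causes this, assuming the forums live under a /forum/ path (the path is made up):

```
# robots.txt -- this blocks CRAWLING only.
# Google can still index /forum/ URLs discovered via links;
# it just can't fetch them, hence the empty-description snippet.
User-agent: *
Disallow: /forum/
```

If this Disallow is in place, the bot never gets to see any noindex instruction on the pages, so the noindex can't take effect.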
If it is most important to block indexing, you have to permit crawling. Give each page a meta that says
<meta name="robots" content="noindex">
(If I misspelled something there, phranque will come along and fix it.)
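For placement, a minimal sketch of where the tag goes (the title is just filler): it has to be in the <head>, and crawling has to be permitted so the bot can actually fetch the page and read it.

```
<!DOCTYPE html>
<html>
<head>
  <title>Some forum thread</title>
  <!-- must sit in <head>; only works if the bot is ALLOWED
       to crawl the page and see this tag -->
  <meta name="robots" content="noindex">
</head>
<body>
  ...
</body>
</html>
```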
If you are already using some kind of CMS-- which it sure seems as if you are-- there is probably some very simple change you can make so this happens automatically everywhere. A mouse click here, a plugin there.
Don't look at me. If you can't get the pages to behave as desired, Option B is to put a small .htaccess file in the directory where all your forums live. You can't have <Directory> sections in .htaccess, so you create a separate .htaccess file and put it in the appropriate directory. It would say
:: shuffling papers ::
Header set X-Robots-Tag "noindex"
Just the one line.
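That directive needs mod_headers, which isn't guaranteed to be loaded on every host, so it's common to wrap it in a guard. A sketch of the whole file, assuming it sits in the forum directory:

```
# .htaccess in the forum directory -- adds the header to every
# response served from this directory and its subdirectories
<IfModule mod_headers.c>
  Header set X-Robots-Tag "noindex"
</IfModule>
```

Same caveat as the meta tag: the X-Robots-Tag header is only seen when the bot is allowed to crawl the URL.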
I would see the actual listing on Google
Do you mean an index entry for the new URL, or the content of the page?
From first post:
this is causing dupe content penalty to my site
If the googlebot can't crawl, how does google know there is duplicate content? Do you factually know that you're being penalized, or are you just getting a bad feeling?
I'm not sure that "penalty" and "duplicate content" really belong in the same sentence anyway. That is, I don't think the algorithm says "OK, this identical content occurs in three different URLs on the site, so we'll drop each one 50 spots from where it would otherwise appear."
netmeg or someone like her would know, but I don't think she hangs out in this subforum.