lucy24 - 8:21 am on Aug 30, 2013 (gmt 0) [edited by: phranque at 9:50 am (utc) on Aug 30, 2013]
So far we have added the Disallow
DO NOT DO THIS.
If a page is already indexed, the last thing you ever want to do is block robots from re-crawling it. If they can't crawl it, they will never see a noindex directive.
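In case it isn't obvious, the noindex has to travel with the page itself, either as a robots meta tag or as an X-Robots-Tag response header. If you go the header route, here's a minimal sketch, assuming mod_headers is available and the htaccess sits in whatever directory holds the duplicates:

<IfModule mod_headers.c>
# tell crawlers that CAN still fetch these pages to drop them from the index
Header set X-Robots-Tag "noindex"
</IfModule>

Either way, the robots have to be able to request the page to see it. Block them and the directive might as well not exist.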
At this point you have at least three separate issues.
#1 Duplicate pages that are being created by the CMS. It seems to be returning valid pages for invalid parameter values. With every upgrade, expect one or more transitional patches to keep URLs from going haywire. You are in the right forum to ask about this.
#2 Duplicate URLs that already exist and might be requested by anyone, including search engines. These need to be redirected in htaccess to the appropriate unique pages. It isn't easy, but you can get htaccess to look at the value of parameters. For example, if "options" is only valid for values up to 3, you'd have a form like this:
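(A sketch only: the parameter name "options" and the 0-3 range come from the example above, but the redirect target, the assumption that values are plain non-negative integers, and the htaccess living in the document root are all placeholders for your own setup.)

RewriteEngine On
# match options=4 through options=9, or any value of two or more digits
RewriteCond %{QUERY_STRING} (^|&)options=([4-9]|\d\d)
# 301 to the same path with the query string stripped
RewriteRule (.*) /$1? [R=301,L]

Note that the trailing ? throws away the whole query string, so you'll need something fancier if other, valid parameters have to survive the redirect. You can check your work by requesting one of the bad URLs with "curl -I" and making sure a 301 comes back.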
#3 Duplicate pages in Google's index. If the problem is with parameter values rather than parameter names, you'll have to concentrate on redirecting. Once the search engines see that URLs b through g all redirect to URL a, the rest of the list will disappear from the index.
Three problems, three solutions.