Welcome to WebmasterWorld
We have a glossary feature on our website that generates different content based on the query string. For example, example.com/glossary.php?page=A
would give a page of all words beginning with the letter A.
Google indexed each one of those pages, so we have 26 pages (one for each letter) of content. Recently we pared down and consolidated our glossary, so all words, A-Z, now appear on a single page, example.com/glossary.php.
So it would seem that we now have 27 pages of duplicate content (example.com/glossary.php, example.com/glossary.php?page=A, example.com/glossary.php?page=B, etc.). We'd like Google to stop indexing any pages with a query string. To do this, we were thinking of checking for the query string on /glossary.php and, if it was there, writing a meta NOINDEX tag to be served to the googlebot. Would this achieve our goal of actually getting those listings with the "?page=" out of the index?
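A minimal sketch of the conditional NOINDEX idea, assuming the query parameter is named `page` as in the URLs above. (One caveat: it is safer to serve the tag to every visitor rather than sniffing for Googlebot, since varying output by user agent can be treated as cloaking.)

```php
<?php
// glossary.php -- hypothetical sketch: emit a robots meta tag only
// when the request arrives with a ?page= query string.
$noindex = isset($_GET['page']);
?>
<html>
<head>
<title>Glossary</title>
<?php if ($noindex): ?>
<meta name="robots" content="noindex,follow">
<?php endif; ?>
</head>
<body>
<!-- ...glossary content... -->
</body>
</html>
```

With "noindex,follow" the bot can still follow links off the page while dropping the URL itself from the index.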
If not, is there another or better way?
Thanks in advance.
Added: Serving a 301 is more accurate: what used to be at /glossary.php?page=A has permanently moved to /glossary.php. It isn't "not found" or "gone"; it's just in a different place.
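The 301 approach above could be sketched like this, again assuming the parameter is named `page`. The header() calls must run before any HTML output is sent:

```php
<?php
// glossary.php -- hypothetical sketch: permanently redirect any
// old ?page=X URL to the consolidated glossary page.
if (isset($_GET['page'])) {
    header('HTTP/1.1 301 Moved Permanently');
    header('Location: http://example.com/glossary.php');
    exit; // stop: nothing else should be rendered for the old URL
}
// ...render the consolidated A-Z glossary below...
```

As Googlebot recrawls the old ?page= URLs, the 301 tells it to replace each of them with /glossary.php rather than keeping 27 duplicates in the index.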