I have a client whose CMS adds option codes to the URL when a user selects a product option.
- this is the product page:
mysite.com/V2/productdetails.php?id=1142 (where id is the product id)
and when a user selects one or more product options, the page reloads with a new URL that has the option codes appended as extra query parameters.
The problem is that Google is indexing each option combination as a separate page, and so sees duplicate title tags and meta descriptions.
How do I configure my robots.txt file to allow the main product details page but disallow the versions with options selected? Here's the catch: there are over 1,500 product details pages, so "?id=####" ranges from ?id=0001 to ?id=1500, and adding 1,500 lines to my robots.txt file is a bit out of the question...
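For what it's worth, here's the kind of wildcard rule I've been experimenting with. It assumes the option codes are appended as additional parameters after the id (i.e. following an "&"), which is an assumption on my part since I haven't shown the exact option URL format:

```
User-agent: Googlebot
# Assumption: option URLs look like
# /V2/productdetails.php?id=1142&<option codes...>
# "*" wildcards are supported by Googlebot (not by every crawler)
Disallow: /V2/productdetails.php?id=*&
```

Bare `?id=1142` URLs would still be crawlable, while any URL with extra parameters after the id would be blocked. Does that look right?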
One solution we have is to switch the robots meta tag to "noindex" when an option is selected, but I'd like to handle it in robots.txt as well...
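For the meta tag approach, this is roughly what we had in mind: a sketch, assuming that any query parameter other than "id" means an option was selected (the function name and the "noindex, follow"/"index, follow" values are my choices, not anything the CMS dictates):

```php
<?php
// Sketch: pick a robots meta tag for the current request.
// Assumption: any query parameter besides "id" indicates a
// selected product option, so that variant gets "noindex".
function robotsMetaFor(array $queryParams): string
{
    unset($queryParams['id']);
    return count($queryParams) > 0
        ? '<meta name="robots" content="noindex, follow">'
        : '<meta name="robots" content="index, follow">';
}

// In the <head> of productdetails.php, something like:
// echo robotsMetaFor($_GET);
```

Using "noindex, follow" (rather than "noindex, nofollow") would still let Google follow links from the option pages back to the indexable ones.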
Thanks in advance!