I am currently using a shopping cart that is set up so that I have very little control over a lot of features. I disagree with their developers on an issue with the meta tags. The site is set up as index, follow, with the exception of my product category pages, which are set to noindex, follow. Their reasoning about duplicate content did not make any sense, and I'd really like the categories to be indexed. I cannot alter the robots.txt, the sitemap, or the master meta tags, as they are auto-generated; however, I can add in custom tags. Would adding in a custom one for index, follow work to override the other (note that theirs would show first)? Or is there any other way to get Google to index the pages? I've tried linking on the site to the pages, and offsite linking, and after a couple of months the pages still don't show up.
Hi jeremy1102, first of all, welcome to WebmasterWorld!
Having two robots meta tags with different content won't help your site much. At best the search engines will ignore one of the two; at worst the conflicting directives may be parsed incorrectly. Some systems automatically drop their auto-generated tag when you enter one manually, so you may want to test that first.
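For illustration, here is roughly what the page head would look like with both tags present, assuming (per your description) the auto-generated tag comes first; there is no guarantee a crawler would prefer the second tag over the first:

```html
<head>
  <!-- auto-generated by the cart software; cannot be edited -->
  <meta name="robots" content="noindex, follow">
  <!-- custom tag added manually -->
  <meta name="robots" content="index, follow">
</head>
```

If the cart really does suppress its own tag when a custom one exists, viewing the live page source after adding yours would confirm whether one or both tags are being output.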
Linking to these pages won't help get them into the SERPs. The robots meta tag is the information the search engines use to decide whether a page may be listed, so as long as it says noindex, the page stays out. The only way to lift this is by changing the source code, and for that you'll have to convince the developers.