vphoner - 12:45 am on Jun 30, 2011 (gmt 0)
I have a PHP script that takes a product number and then redirects to an external site with that product number. The script works fine. The one problem is that search engines (Google in particular) are indexing all these calls to the script as separate pages, which can generate hundreds of pages. I would say these could count as "thin" content pages. My goal is to prevent these pages from being indexed, but you cannot insert the standard <meta name="robots" content="noindex,nofollow"> in the script, since it outputs a redirect rather than an HTML page. (I tried, and the script hung and never went to the external link.)
The question is: other than blocking a directory in robots.txt, is there a way to get Google to stop indexing these pages? And is it a good idea to keep them out of the index, given the recent changes in Google's algorithms?
Is there another way to call the external link, other than the way I have coded it, that would also block indexing of the link?
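For context, the robots.txt blocking I mentioned would be something like the following (assuming the script lives under a /redirect/ directory, which is just an example path, not my real one). My understanding is that this only blocks crawling, and a blocked URL can still appear in the index if other pages link to it:

```
User-agent: *
Disallow: /redirect/
```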
Here is an example of the php script:
//product number is passed through the query string
$PRODUCTNUM1234 = $_GET['PRODUCTNUM1234'];
//redirect to the external link with this product number
//(note the space after "Location:" and the exit so the script stops cleanly)
header("Location: http://www.external-site.com/product/" . $PRODUCTNUM1234 . "/myid");
exit;
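While researching this, I came across the X-Robots-Tag HTTP header, which Google is supposed to honor even for non-HTML responses like redirects. Would something like this work? This is an untested sketch; build_redirect_url() is just a helper name I made up, and the command-line guard is only there so the helper can be exercised outside a web server:

```php
<?php
// Untested sketch: send noindex as an HTTP header instead of a meta tag,
// since a Location redirect never serves any HTML for a <meta> tag to live in.

// Hypothetical helper, not from my original script; rawurlencode() also
// guards against header injection through the query parameter.
function build_redirect_url($productnum) {
    return "http://www.external-site.com/product/" . rawurlencode($productnum) . "/myid";
}

if (PHP_SAPI !== 'cli') { // skip the redirect when run from the command line
    $PRODUCTNUM1234 = isset($_GET['PRODUCTNUM1234']) ? $_GET['PRODUCTNUM1234'] : '';

    // X-Robots-Tag is honored by Google for non-HTML responses such as redirects
    header("X-Robots-Tag: noindex, nofollow");
    header("Location: " . build_redirect_url($PRODUCTNUM1234));
    exit; // stop the script so nothing else runs after the redirect
}
```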