sauce - 7:43 pm on Aug 31, 2011 (gmt 0) [edited by: tedster at 8:01 pm (utc) on Aug 31, 2011]
I'm trying to recover an ecommerce site from Panda, and I've found tons of duplicates in Google coming from sort pages... each listing has 6 sort functions, so basically I have 6 duplicate pages for each category/product.
For the sort pages, I've added:
<link rel="canonical" href="http://www.example.com/categorypage.php" />
to the headers, but I'm wondering if I should add:
<meta name="ROBOTS" content="NOINDEX,FOLLOW">
to the sort page headers as well... Will that also deindex http://www.example.com/categorypage.php ?
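To show what I mean, here's a rough sketch of what I'd put in the category template head (assuming the sort links use a ?sort= parameter, which is just a placeholder for whatever the cart actually uses):

<?php
// Hypothetical sketch only - "sort" stands in for the real sort parameter name.
$isSortVariant = isset($_GET['sort']);

// Every version of the page points its canonical at the clean category URL.
echo '<link rel="canonical" href="http://www.example.com/categorypage.php" />' . "\n";

// The noindex would only be printed on the sort variants,
// so categorypage.php itself would stay indexable.
if ($isSortVariant) {
    echo '<meta name="robots" content="noindex,follow">' . "\n";
}
?>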
Also, is there any robots.txt wildcard/regex or something I can use to block everything after the ?
e.g.: Disallow: /*.php?*
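Or would something narrower work that only matches the sort parameter? Something like this (again, "sort" is just a placeholder for the real parameter name):

User-agent: *
# only block the sort variants, not every parameterized .php URL
Disallow: /*?sort=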