tedster - 5:18 am on Oct 12, 2011 (gmt 0)
Can you disallow a parameter in robots.txt?
Most likely, yes - provided the character string that's used as your parameter name doesn't also appear in the file and directory structure that the site uses.
This requires the pattern-matching wildcard "*" within the Disallow rule - that's an extension of the original robots.txt specification that Google supports. So imagine you want to disallow crawling of any URL that uses the parameter "pdq".
The rule Disallow: /*pdq would do it. But if your parameter is "sch" and you also have a URL like /kirschwasser.php - then you're in a bit of trouble, because the rule matches the "sch" inside the file name too. Anchoring the pattern on the full key=value form, as in Disallow: /*sch=, usually avoids that kind of collision.
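To see why the collision happens, here is a minimal sketch of how a wildcard Disallow pattern can be checked against a URL path. This is not Googlebot's actual matcher - it's a simplified illustration that translates "*" to a regex (Python's stdlib urllib.robotparser does not support the wildcard extension, so we roll our own):

```python
import re

def rule_matches(pattern: str, path: str) -> bool:
    # Translate a robots.txt Disallow pattern into a regex:
    # '*' matches any run of characters; a trailing '$' anchors
    # the end (both are Google's extensions, not the original spec).
    body = pattern[:-1] if pattern.endswith("$") else pattern
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in body)
    if pattern.endswith("$"):
        regex += "$"
    # Disallow rules match from the start of the path.
    return re.match(regex, path) is not None

# "Disallow: /*sch" blocks the parameter URL as intended...
print(rule_matches("/*sch", "/page.php?sch=widgets"))    # True
# ...but it also catches an ordinary page containing "sch":
print(rule_matches("/*sch", "/kirschwasser.php"))        # True
# Anchoring on "sch=" avoids that collision:
print(rule_matches("/*sch=", "/kirschwasser.php"))       # False
print(rule_matches("/*sch=", "/page.php?sch=widgets"))   # True
```

The hypothetical URLs here are just for illustration; the point is that a bare substring pattern matches anywhere in the path, while including the "=" ties the match to an actual query parameter.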