Msg#: 4012486 posted 3:05 am on Oct 24, 2009 (gmt 0)
The robots.txt matching is left-to-right prefix matching, so if you Disallow: /blue-widgets/ that will exclude anything in the blue-widgets directory. Note that you don't specify the domain in robots.txt, only the path. The pattern matching is limited, so I'm not sure how you could specify a range of numbers.
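A minimal sketch of what that might look like (the paths here are made-up examples; the wildcard line is a nonstandard extension that some crawlers such as Googlebot honor, not part of the original robots.txt standard):

```
# robots.txt -- lives at the root of the domain; paths only, no domain.
User-agent: *

# Left-to-right prefix match: blocks /blue-widgets/ and everything under it.
Disallow: /blue-widgets/

# Extension syntax supported by some major crawlers (* wildcard, $ end anchor),
# but there is still no way to express a numeric range like id=100-200.
Disallow: /widgets/print$
```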
If those URLs are already indexed, blocking them with robots.txt may not really solve your duplicate content problem, nor will it necessarily prevent the URLs from being indexed in the future -- a blocked URL can still show up in the index if other pages link to it. The proper way to solve this is probably to use 301 redirects to the canonical URL.
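On Apache, a 301 could be set up in .htaccess along these lines (the paths and domain are hypothetical placeholders, assuming mod_alias is enabled):

```
# .htaccess -- permanently redirect the duplicate path to the canonical URL.
# Search engines treat a 301 as "moved permanently" and consolidate the URLs.
Redirect 301 /old-blue-widgets/ http://www.example.com/blue-widgets/
```

Unlike a robots.txt block, the redirect lets crawlers follow the old URL and transfer it to the canonical one, so the duplicates eventually drop out of the index.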