This is driving me a little batty. I keep getting messages in Google Webmaster Tools about Googlebot finding too many URLs on a site. I took action on this a while ago and disallowed all of the offending URLs (filter combinations, sorts, etc.) in robots.txt to try to fix the problem. But I keep getting the message, and it now cites URLs that are all disallowed in robots.txt! What should I do here? Do I need to put these links behind JavaScript or switch everything to drop-downs?
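
For context, the disallow rules are along these lines (a rough sketch, not my exact file; `filter` and `sort` stand in for whatever query parameters the faceted navigation actually uses):

    User-agent: *
    # Block any URL whose query string contains a filter or sort parameter
    Disallow: /*?*filter=
    Disallow: /*?*sort=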