aakk9999 - 5:49 am on Sep 10, 2010 (gmt 0)
Thanks to both, and the relevant thread was useful. I do not have a Googlebot-specific entry; I have only one entry for all robots, which until a few weeks ago seemed to work fine for Googlebot too.
Disallowed but crawled pages do not show in SERPs at all, so no problem there. No drop in traffic, and rankings are as usual. I am more worried about crawl budget being spent on URLs with permuted dates and the impact that could (or could not) have in the future.
Anyway, I will change robots.txt to add a separate section explicitly for Googlebot to see if this has any impact. If not, I will ask the developers to change the location.href into a doPostBack, which should solve it.
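One thing to watch when adding that section: once a Googlebot-specific group exists in robots.txt, Googlebot ignores the "User-agent: *" group entirely, so every rule has to be repeated in the Googlebot group. A sketch (the /calendar/ path is just a placeholder for your permuted-date URLs):

```
# Generic group for all other crawlers
User-agent: *
Disallow: /calendar/

# Googlebot-specific group; Googlebot obeys ONLY this group,
# so duplicate all rules from the generic group above
User-agent: Googlebot
Disallow: /calendar/
```

If the rules are identical in both groups and the disallow still is not honored, that points the problem back at the URL generation (the location.href) rather than the robots.txt syntax.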