About three weeks ago I noticed that Google started permuting the parameters in our product search box and crawling the search results pages generated this way. The search box has a calendar for entering to/from dates, and Google has been filling in dates and generating lots of "new" URLs.
This was not happening before. I only noticed it because URLs with date parameters are disallowed via robots.txt, and GWT now shows a large number of restricted URLs. The site's search is executed via JS: it builds a URL from the entered search parameters and then navigates with location.href.
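To illustrate, the navigation works roughly like this (a minimal sketch; the parameter names `q`, `date_from`, and `date_to` are hypothetical, not the site's actual ones):

```javascript
// Sketch of the client-side search navigation described above.
// Builds a query string from the entered search parameters.
function buildSearchUrl(base, params) {
  const query = Object.entries(params)
    .map(([key, value]) => `${encodeURIComponent(key)}=${encodeURIComponent(value)}`)
    .join("&");
  return `${base}?${query}`;
}

// In the browser the site then navigates with something like:
//   location.href = buildSearchUrl("/search", {
//     q: "widgets",
//     date_from: "2024-05-01",
//     date_to: "2024-05-07",
//   });
```

The resulting date-parameter URLs are then blocked with a robots.txt rule along the lines of `Disallow: /*date_from=` (again, the exact pattern here is an assumption).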
We do not want to use rel=canonical, because the search results pages are different from the listing pages we allow to be indexed, and I would not know which listing page to point the canonical at. I am now wondering about crawl budget being wasted unnecessarily, given MC's comment that URLs disallowed in robots.txt still count toward crawl budget.
What is the best course of action here?