JamJar, luv the nic!
It is generally accepted that Google will parse and index dynamic URLs, but also that it restricts the number of dynamic URLs it will spider. How it restricts them is not officially known, but the general consensus is that it is a combination of PR, depth, and the number of URL variables.
The reason, of course, is that one page could generate thousands of variations by pulling data out of a database, and hence update cycles would be in danger of shifting from Lunar to Solar - not something we want to see happening.
From my experience, I tend to build separate pages per product/category/whatever and hard-code as many variables into the page as possible, to reduce the variables after the ?. Your idea of a/b/c/d would do the same thing and certainly comes recommended.
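As an aside, the a/b/c/d idea is essentially query-string-to-path rewriting. Here is a minimal Python sketch of that mapping, purely illustrative - the names (`products.php`, `cat`, `item`) are made up for the example, and in practice the rewrite would normally live in mod_rewrite rules or your application's routing rather than a helper like this:

```python
from urllib.parse import urlsplit, parse_qsl

def to_path_url(url, param_order):
    """Rewrite a dynamic URL's query string into static-looking path segments.

    e.g. /products.php?cat=12&item=34 -> /products/cat/12/item/34
    (hypothetical page and parameter names, for illustration only)
    """
    parts = urlsplit(url)
    params = dict(parse_qsl(parts.query))
    base = parts.path.rsplit(".", 1)[0]  # drop the .php extension
    segments = []
    for key in param_order:  # fixed order keeps one canonical URL per page
        if key in params:
            segments.extend([key, params[key]])
    return base + "/" + "/".join(segments)

print(to_path_url("/products.php?cat=12&item=34", ["cat", "item"]))
# -> /products/cat/12/item/34
```

The point of the fixed `param_order` is that `?cat=12&item=34` and `?item=34&cat=12` collapse to a single path, so the spider sees one URL per page instead of every permutation of the variables.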
However, I do have some high-PR pages that produce secondary iterations (page=2 etc.) to get all the data displayed, and these seem to be indexed down to 8 levels from a PR6 page.
So, to be honest, I would reduce anything that might hinder indexing if I could. It may be an idea to go back to the tech guys and have another talk with them.
For example, what reason did they give for not accepting your ideas, and were those reasons sufficient to risk losing sales?
Let us know how you get on.