deadsea - 12:15 pm on Dec 27, 2012 (gmt 0)
The problems with complicated URLs and parameters usually come on your side, not on Google's. If you have long URLs with lots of parameters, it's hard to use them consistently. If you don't use them consistently, Googlebot will crawl extra pages, you will have problems with duplicate URLs, and you may not be indexed and ranked as well as you should be.
It sounds like you have multiple parameters on each URL. This can be problematic for several reasons:
1) What happens when the parameters are out of order? Your site should redirect to put them into one canonical order, or at least make sure you have a canonical tag with them in that order. Otherwise Googlebot will see two different URLs for the same page.
2) What happens when a parameter is missing? You should either return an error (with an error status code) or redirect to fill in the default value for that parameter. If your server silently assumes a default value and responds exactly as if the parameter were present in the URL, you will have the same duplicate-content problems as with out-of-order parameters.
3) Are you using parameters that are customized per user? Things like tracking parameters, which page a user came from, the last search a user performed, etc. If so, Googlebot shouldn't be crawling pages with these in the URL; they will just confuse it. There are settings under "URL Parameters" in Google Webmaster Tools that can help in this case, but I would recommend not putting parameters like these on crawlable URLs at all.
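The three checks above can be combined into one canonicalization step. Here is a minimal sketch in Python using only the standard library; the parameter names, defaults, and tracking-parameter list are hypothetical and would need to match your own site:

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

# Hypothetical defaults and per-user tracking parameters, for illustration only.
DEFAULTS = {"sort": "relevance", "page": "1"}
TRACKING = {"utm_source", "utm_medium", "ref", "from_search"}

def canonical_url(url):
    """Return the canonical form of a parameterized URL:
    per-user tracking parameters stripped (point 3),
    missing parameters filled in with their defaults (point 2),
    and the remainder sorted into one fixed order (point 1).
    If the result differs from the requested URL, the server should
    301-redirect to it, or at least emit it in a rel="canonical" tag.
    """
    scheme, netloc, path, query, fragment = urlsplit(url)
    params = {k: v for k, v in parse_qsl(query, keep_blank_values=True)
              if k not in TRACKING}
    for name, value in DEFAULTS.items():
        params.setdefault(name, value)
    return urlunsplit((scheme, netloc, path,
                       urlencode(sorted(params.items())), fragment))

# Out-of-order, missing-default, and tracking-tagged variants all collapse
# to the same single crawlable URL:
print(canonical_url("http://example.com/shoes?color=red&sort=relevance&page=1"))
print(canonical_url("http://example.com/shoes?page=1&color=red&utm_source=feed"))
```

The point of funneling every variant through one function like this is that Googlebot only ever sees one URL per page, no matter how the links on (or off) your site were written.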
If you can use URLs with parameters without falling into these pitfalls, there is no big SEO advantage to rewriting your URLs. But I find it very hard to keep the discipline that parameters require: developers have a tendency to add a parameter any time they need one. URL rewriting is one way to make them aware that there are SEO considerations when doing so.