Which can Googlebot read better, static or dynamic URLs?
We've come across many webmasters who, like our friend, believed that static or static-looking URLs were an advantage for indexing and ranking their sites. This belief rests on the presumption that search engines have trouble crawling and analyzing URLs that include session IDs or source trackers. In fact, we at Google have made significant progress in both areas. While static URLs may have a slight advantage in clickthrough rates because users can read them more easily, running a database-driven website does not imply a significant disadvantage in indexing and ranking. Serving search engines dynamic URLs should be favored over hiding the parameters to make those URLs look static.
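To make the distinction concrete, here is a minimal sketch (the URLs and parameter names are hypothetical): a dynamic URL exposes its parameters explicitly, so a crawler can parse out which ones identify content and which are mere session trackers, while a rewritten "static-looking" URL hides that information.

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical dynamic URL: the parameters are visible and machine-readable.
dynamic = "https://www.example.com/products?category=shoes&item=1234&sessionid=9f8e7d"

query = parse_qs(urlparse(dynamic).query)
print(query["category"])  # ['shoes']
print(query["item"])      # ['1234']

# A rewrite rule might disguise the same page as a static-looking URL,
# e.g. https://www.example.com/products/shoes/1234 -- here the parameter
# names (and the session ID) are no longer explicit, so a crawler has to
# guess which path segments actually determine the content.
```

The point of the sketch: when the parameters stay visible, a search engine can recognize `sessionid` as irrelevant and ignore it; once the URL is rewritten, that signal is lost.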
So everyone having ranking problems caused by their rewrite rules can now see clearly how to fix them.
As a matter of fact, many ecommerce sites have used dynamic URLs with parameters since the beginning of ecommerce, and they have always ranked well in Google without rewrite rules. Some of this is old hat to a few of us, but it's news to the rewrite-rule junkies.