This question has probably been covered here before but I couldn't find it when searching.
I run a site with around 10,000 visitors daily and have had good rankings on Google for a long time.
A lot of the content is database-generated, and many of the pages take a lot of query-string arguments.
Basically, I wonder how Google feels about using a script to generate lots of separate pages, so that the spider can see some of the content it currently misses.
An extreme example: say I have a database with 100,000 items. Instead of linking to xxx/display.phtml?id=18500
to show an item, I'd use a script to create 100,000 files, 'display1.phtml' through 'display100000.phtml',
and link to xxx/display18500.phtml instead.
Each of these 100,000 files would have unique, useful content, but the navigation would be the same.
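To make it concrete, here is a minimal sketch of the kind of generation script I mean. This is just an illustration in Python, not my actual setup: the SQLite database, the `items` table, and its columns are all hypothetical stand-ins for my real schema.

```python
import os
import sqlite3

def generate_static_pages(db_path, out_dir, template):
    """Write one static 'displayN.phtml' file per item row.

    Hypothetical example: assumes an 'items' table with
    (id, title, body) columns; the real schema would differ.
    """
    os.makedirs(out_dir, exist_ok=True)
    conn = sqlite3.connect(db_path)
    try:
        for item_id, title, body in conn.execute(
            "SELECT id, title, body FROM items"
        ):
            # Each generated file gets unique content from its row,
            # while the surrounding template (navigation) stays the same.
            page = template.format(title=title, body=body)
            path = os.path.join(out_dir, f"display{item_id}.phtml")
            with open(path, "w", encoding="utf-8") as f:
                f.write(page)
    finally:
        conn.close()
```

So item 18500 would end up as a plain file, display18500.phtml, that the spider can reach through an ordinary link instead of a ?id= parameter.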
How does Google feel about this approach?