I am developing a website where each page offers a time-related service for a different location around the world. There are about 22,000 pages on the website, and each page URL is registered in a submitted sitemap.
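For reference, a sitemap entry for one of these pages might look like the following (the domain and URL pattern are hypothetical, shown only for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per location page; example.com is a placeholder -->
  <url>
    <loc>https://example.com/product_1.html</loc>
    <changefreq>monthly</changefreq>
  </url>
  <!-- roughly 22,000 such entries in total -->
</urlset>
```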
Now, although the content is clearly different on each page, there is not a lot of text on any of them. Moreover, some could argue that it is not original content. However, my implementation is cleaner (better GUI) and much faster than the competition's. I also plan to make these pages work well on smartphones, which the competition does not.
I see a possible risk of being penalized for an excess of thin-content pages.
I am wondering whether converting these pages into a single page with URL query parameters would eliminate that risk, e.g. changing URLs from /product_1.html to /product.html?id=1.
Changing URLs from /product_1.html to /product.html?id=1 will not solve this problem; in fact, it would initially flood your site with duplicates, doubling the number of product URLs Google knows about.
You would then have to either return 404/410 for the old URLs or redirect them to the new URL versions. Then you *could* tell Google via WMT parameter handling to index only one representative URL with the id=nnn parameter, provided you do not use the id= parameter in other URL constructs.
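If you went the redirect route, a single rewrite rule would cover all 22,000 pages. This is a minimal sketch assuming an Apache server with mod_rewrite enabled and the /product_N.html naming scheme from the question:

```apache
# .htaccess — 301-redirect the legacy per-product pages
# to the single query-parameter URL
RewriteEngine On
# /product_123.html  →  /product.html?id=123
RewriteRule ^product_([0-9]+)\.html$ /product.html?id=$1 [R=301,L]
```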
It would take Google some time to process your 404/410 responses and discover the new URLs, and there is a risk that in the meantime the site's ranking will tank before recovering (and it may or may not recover to where it was). So personally I would not go for this solution.
What I would do instead is leave the URLs as they are now and add a meta robots noindex tag to each of these thin pages.
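Concretely, that means adding the following to the head of each thin page (the "follow" directive is optional; it keeps crawlers following the page's links even though the page itself stays out of the index):

```html
<!-- in the <head> of each thin page: keep it out of the index,
     but still let crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```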