I have a client that is considering an SEO move that I'm not totally sure about, and I wanted to see what others think before I make a recommendation. Any feedback, particularly from actual experience, is greatly appreciated.
The goal: improve long tail keyword rankings in Google.
Some background on the site (large, several million page views/mo):
- The majority of the site is made up of categories (think widgets), with hundreds or thousands of paginated listing descriptions under each category or sub-category.
- Each listing description links to a unique page describing that widget in full detail.
- These category pages, with their associated listing descriptions as content, already rank well for 50+ head terms.
- Approximately 10,000 targeted long tail keywords have been identified; these are primarily derivatives of the head terms.
- A page has been created for each long tail keyword, targeting that phrase.
- The long tail targeted pages have basically the same content as the main category and sub-category pages, with the exception of unique page titles, meta tags, and headers.
- The client now wants to add links to each of these long tail pages in their XML sitemap.
As a concrete example: you have a category page (www.example.com/category1/) with several hundred ordered listings as content (paginated, of course). This category page targets one head term. Add 50 additional pages that target related long tail phrases (www.example.com/category1/key1-key2-key3/). Each additional page has the same content as the main category page but with a unique page title, meta tags, and H1 header. Then add links to each of these 50 new pages to the XML sitemap in the hope that Google picks them up.
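For what it's worth, the sitemap side of the proposal is mechanically simple; a minimal sketch of generating the extra entries might look like this (the domain, category, and keyword slugs below are hypothetical placeholders, not the client's actual URLs):

```python
from xml.etree import ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(base_url, category, keyword_slugs):
    """Build a sitemap <urlset> with one <url> entry per long tail page,
    e.g. https://www.example.com/category1/key1-key2-key3/."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for slug in keyword_slugs:
        url = ET.SubElement(urlset, "url")
        loc = ET.SubElement(url, "loc")
        loc.text = f"{base_url}/{category}/{slug}/"
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical example: two long tail slugs under one category.
xml = build_sitemap("https://www.example.com", "category1",
                    ["key1-key2-key3", "key1-key4"])
print(xml)
```

So the question isn't really about implementation effort; it's whether telling Google about 10k near-duplicate pages this way helps or hurts.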
The obvious concerns: duplicate content issues; possibly being perceived as black hat; keyword cannibalization.
Could this really work and improve their long tail SERP rankings?
Could this backfire in a major way?
If it could potentially backfire, do you think this is even worth experimenting with, say by trying it out on a few dozen pages to see what happens?
Other suggestions for a dynamic SEO approach to 10k long tail keywords?