Forum Moderators: open
I know duplicate pages are bad, and that pages with 100+ links are bad.
The problem is that the site I am working on has both, and I can't really change it.
The 100+ link page is an inventory listing, and the "duplicates" are product pages. They're not exactly identical, but they're extremely close. The content is basically the product photo, a unique title, and unique product statistics.
The site's owner has forbidden me to change the inventory listing, and there isn't much unique content that I could put on the product pages. But I'm willing to try almost anything, because the more product pages we add, the more the site seems to slip in Google. The product pages make up about 65% of the site.
Here are my questions: Does Google consider the layout of the page? (So if I changed the layouts slightly, would they be unique pages?) How much content has to vary before the pages are considered duplicates? And does anyone have any advice?
Why not try this:
1) Use robots.txt to request that spiders ignore the big links page (not that I would worry about that myself, but if you are concerned), then
2) Create smaller, more "themed" inventory pages that still link to all the products, but spread the links across a few pages.
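A robots.txt sketch for point 1, assuming the inventory listing lives at /inventory.html (swap in the real path on your site; Disallow matches by path prefix, so /inventory would also block /inventory2.html etc.):

```
# Ask all well-behaved spiders to skip the big inventory page
User-agent: *
Disallow: /inventory.html
```

Bear in mind robots.txt is a request, not an enforcement mechanism, so badly behaved bots may ignore it.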
OR
1) Make all the links on the inventory page JavaScript links, thus preventing robots from following them.
OR
1) Keep the links that you want the spider to see as they are, and use JavaScript links for the rest of the links on the inventory page, thus reducing the link count that the spider sees. You can back this up with robots.txt by requesting that the near-duplicates are not spidered, in case something slips through the net.
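For the JavaScript-link options, one common pattern is to replace the anchor's href with an onclick navigation, so there is no crawlable URL in the markup (paths below are hypothetical; note that spiders generally don't execute JavaScript, but that's a behaviour, not a guarantee):

```html
<!-- Normal link: spiders can discover and follow this URL -->
<a href="/products/widget-123.html">Widget 123</a>

<!-- JavaScript link: no href attribute for a spider to follow -->
<span style="cursor: pointer; text-decoration: underline;"
      onclick="window.location.href='/products/widget-123.html';">
  Widget 123
</span>
```

The trade-off is that users without JavaScript can't follow these links either, so use them only for the links you actively want hidden.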