Google looks at all the page code, so sites that are heavy with shared copy, links, promos, etc. can suffer badly if the 'unique' content is just a para or two.
You can help by reducing page bloat and using CSS - but creating new and original copy is best for your visitors, the web, Google - and you.
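As a rough sketch of the bloat point (the file and class names here are invented for illustration, not taken from anyone's site): styling repeated inline on every page adds weight without adding unique content, and one shared stylesheet slims it down.

    <!-- before: presentation repeated inline on every page -->
    <p style="font-family: Arial; color: #333; margin: 10px;">
      Unique copy about this product...
    </p>

    <!-- after: each page links one shared stylesheet instead -->
    <link rel="stylesheet" href="/styles.css">
    <p class="copy">Unique copy about this product...</p>

    /* in /styles.css (hypothetical file name) */
    p.copy { font-family: Arial; color: #333; margin: 10px; }

The less boilerplate each page carries, the better each page's ratio of unique copy to shared code.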
We have a section for a certain general type of product. We have created multiple pages, each with more specifics per product (all relating back to the general product), so visitors have an easier time navigating. Bottom line: visitors end up at the exact same page to order the product.
For that reason, all pages have a similar structure and the same text links. The title tags, metas and paragraphs are unique, each describing the specific purpose of its page.
Most of our pages are not listed in Google, and we are trying to figure out the cause. Is it likely that Google still considers these pages duplicates and for that reason is not listing them in its index?
Remember, Google isn't interested in why you did it; it only sees the duplicate content.
Think of your visitors too (that's why Google does this); is it fair for them to go clicking around your site, waiting for long downloads, then just getting a para per page?
Try to reduce the number of pages, and put more content per page (e.g. instead of red, blue, green and yellow widget pages, have one widget page with all four).
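For example (a sketch only - the file names and anchors are made up): four thin colour pages can collapse into one page with anchored sections, and links elsewhere on the site can point straight at the anchors.

    <!-- widgets.html replaces red.html, blue.html, green.html and yellow.html -->
    <h1>Widgets</h1>
    <h2 id="red">Red widgets</h2>
    <p>Unique copy about the red widget...</p>
    <h2 id="blue">Blue widgets</h2>
    <p>Unique copy about the blue widget...</p>
    <!-- green and yellow sections follow the same pattern -->

    <!-- elsewhere on the site, link straight to a section -->
    <a href="/widgets.html#red">Red widgets</a>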
There's nothing wrong with using page management systems, and no reason to create every page separately. But reduce as far as possible the 'shared' content, and maximize the unique.
Does every page need every link? I doubt it. Why not have major links on every page, and cut out minor links except within their own 'sections'?
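Concretely (the section names are invented for illustration): a small global nav repeated on every page, with the deeper links confined to their own section.

    <!-- global nav, repeated on every page -->
    <a href="/">Home</a> | <a href="/widgets.html">Widgets</a> | <a href="/contact.html">Contact</a>

    <!-- section-only links, shown just within the widget pages -->
    <a href="/widgets.html#red">Red</a> | <a href="/widgets.html#care">Care guide</a>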
Meta tags should reflect your variety, not be the only source of it - ask your visitors!
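For instance (invented tags, just to show the shape): each page's title and description written for that page alone, matching the unique copy on it.

    <!-- on the widgets page -->
    <title>Widgets - Sizes, Colours and Ordering</title>
    <meta name="description" content="All four widget colours, with sizing and ordering details.">

    <!-- on the gadgets page -->
    <title>Gadgets - Weatherproof Models</title>
    <meta name="description" content="Our gadget range for outdoor use, with specs and pricing.">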