I have 3 domains - two of them well established and well linked. Google doesn't really want to index the third (newest) one, which is linked from the other two. Only 10% of its pages are indexed. Now I have two theories:
1. The domain's history. The domain was already registered in 2006 and 2007, but I can't find out what was hosted there - archive.org doesn't have anything on it. Are there other methods? And even if this domain was full of black-hat SEO in the past, how long will it take for Google to understand that things have changed?
2. This domain differs from my other two in that most of its pages are about 80% identical. This isn't meant as some kind of SEO trick - it's just the nature of the website. There are common links to various parts of the service everywhere, plus unique user-generated content, which is usually very short. So a page looks like this:
What's the service about (the same on all the pages)
Recently added stuff (the same on all the pages)
Other common stuff (the same on all the pages)
Unique content
Other common stuff (the same on all the pages)
So I'm wondering: how much unique content on a page is enough?
p.s. A site:domain search returns n results; after clicking "more" it returns n+1 results (only one more).
p.s.2. If you see a grammar error, please let me know - it will help me improve my English :)
When I work with a site that needs to put a large amount of the same content on every page, I usually suggest an iframe to hold it. That way the duplicated content is not in the source code of every URL - only in the one URL that the iframe loads. This approach often works very well.
[edited by: tedster at 6:08 pm (utc) on Mar. 11, 2008]
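To make the iframe suggestion concrete, here is a minimal sketch of what one content page might look like. The file name common-blocks.html and the element names are just placeholders for illustration, not anything from the posts above: the shared "about the service" / "recently added" blocks move into their own URL, and each content page pulls them in with an iframe, so only the unique user-generated text remains in that page's own source.

    <!-- page-12345.html: one content page; only the unique text is in this page's source -->
    <html>
    <head>
      <title>Example page</title>
    </head>
    <body>
      <!-- shared blocks loaded from a single separate URL instead of being repeated in every page's HTML -->
      <iframe src="/common-blocks.html" width="100%" height="300" frameborder="0"></iframe>

      <!-- the unique, user-generated content stays directly in the page -->
      <div id="unique-content">
        ...short user-generated text for this page...
      </div>
    </body>
    </html>

If you go this route, many people also keep the iframed URL itself out of the index (for example with robots.txt or a noindex meta tag) so it doesn't get picked up as a thin standalone page - whether that makes sense depends on the site.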