Forum Moderators: Robert Charlton & goodroi
I'm now working for a pretty large site / network. We have quite a bit of syndicated content - most of it is served via customized feeds with URLs like partner1.mysite.com and partner2.mysite.com, and some of it is also served on our partners' own sites. We have good PR but rank only OK (which I attribute to poor optimization).
The company has also recently started launching mini sites focused on specific verticals - although the content on these sites is 100% identical to the content on the mysite.com network. My instinct is that without original content, these are a bad idea.
Anyway, my questions are:
- Is G smart enough to know that the content is syndicated so that I don't need to worry about what our partners are doing (G won't see our site as spam ever, will it?)
- Does anyone else have experience with duplicate content on a large scale?
- Do you agree that the mini sites with exact duplicate content are a bad idea, given that they contain nothing original?
Thanks!
Repeating content across multiple sites is a recipe for a Google catastrophe.
You could redesign and rewrite the pages so that most of the text content on each page is unique, or block Google from crawling the additional sites (via robots.txt).
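For the robots.txt route, a minimal sketch - assuming each mini site or partner feed lives on its own hostname (the subdomain names here are hypothetical) - would be to serve a blocking robots.txt on every duplicate host while leaving the main site's robots.txt open:

```
# robots.txt served at partner1.mysite.com (and each other duplicate host)
# Blocks all compliant crawlers from indexing the syndicated copy
User-agent: *
Disallow: /
```

Note that robots.txt is per-hostname, so the main site's file is unaffected; each subdomain needs its own copy.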
You never know which copy will be hit by the duplicate-content filter. It may be your main page.
Anyway, I'm going to recommend that we stop creating the mini sites, as I think they are a recipe for disaster. I think G is smart enough to recognize duplicate content that results from syndication, but once we start launching mini sites with the exact same content on the same IP, they might whack us pretty hard someday.