The replicated sites are overall pretty much identical.
At the main address of our site we have company news, information, a blog and other info about our services...
Until Panda, we ranked highly for keywords related to this particular company which we provide services for.
After reading about Panda, I'm wondering if we aren't being penalized for a large amount of internal duplicate content. If that's the case, is there any way to overcome it? All of the reps who use our replicated sites have identical product catalogs inside their sites, which, because of company rules, can't be altered:
Panda has wreaked havoc on duplicate content of all kinds. I think you'd be well off not to allow those internal duplicates to be indexed - it certainly could help, but you'll only find out if you give it a try.
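One straightforward way to keep the duplicated catalog pages out of the index is a robots noindex meta tag on each of those pages. This is just a sketch - it assumes the duplicated catalog pages are templated so you can add something to their <head> across all the replicated sites:

```html
<!-- Placed in the <head> of each duplicated catalog page on the
     replicated sites. "noindex" tells crawlers not to index the
     page; "follow" still lets them crawl the links on it, so the
     rest of the site isn't cut off. -->
<meta name="robots" content="noindex, follow">
```

If you can't touch the page templates but can change server config, an X-Robots-Tag HTTP header on those URLs does the same thing. Either way, only the identical catalog pages should get this treatment, not the unique pages you still want ranking.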
I sure hope those clients of yours weren't hoping for search traffic. If they were, this wouldn't be a good approach for them to use, at any rate.
Panda has wreaked havoc on duplicate content of all kinds.
It used to be that a group of words counted as duplicate only if they appeared in the same order. Now a group of words can be flagged as duplicate in any order, without even containing all of the same words. You can see this when you search for exact-match titles: swap the words around and the same article is still returned more often than not. This is especially true of heavily repeated titles such as those found in product feeds.
As for the subject of this thread, I'd try to have only minimal identical text on each site and encourage the owners to write their own unique copy. You'll also want to let them use a keyword or the company name in the URL, since "site 1" or "1234567" says nothing about the subject, and neither does the TLD.