We run a replicated hosting site where we sell e-commerce sites to reps of a particular company.
Example:
domain.com/site1
domain.com/site2
domain.com/site3
The replicated sites are pretty much identical to one another.
At the main address of our site we have company news, information, a blog, and other info about our services.
Until Panda, we ranked highly for keywords related to this particular company which we provide services for.
After reading about Panda, I'm wondering if we're being penalized for a large amount of internal duplicate content. If that's the case, is there any way to overcome it? All of the reps who use our replicated sites have identical product catalogs inside their sites, which, because of company rules, can't be altered:
domain.com/site1/store
domain.com/site2/store
domain.com/site3/store
If we block Google from all of the internal replicated pages, would that help us overcome the hit we've taken?
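For reference, the kind of blocking I have in mind would be robots.txt rules along these lines (site1, site2, etc. stand in for our actual rep paths, and I believe Googlebot also supports the * wildcard form, though I'd want to confirm that):

User-agent: Googlebot
Disallow: /site1/store/
Disallow: /site2/store/
Disallow: /site3/store/

or, if the wildcard is supported:

User-agent: Googlebot
Disallow: /*/store/

I'm not sure whether blocking crawling like this is enough, or whether something like a noindex tag or rel="canonical" on the duplicate store pages would be the better approach.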
I sincerely appreciate any ideas you might have!
-Mark