
Overcoming Panda - What about replicated websites?

   
9:11 pm on Feb 9, 2012 (gmt 0)

10+ Year Member



We run a replicated hosting site where we sell e-commerce sites to reps of a particular company.

Example:

domain.com/site1
domain.com/site2
domain.com/site3

The replicated sites are overall pretty much identical.

At the main address of our site we have company news, a blog, and other information about our services...

Until Panda, we ranked highly for keywords related to this particular company which we provide services for.

After reading about Panda, I'm wondering if we're getting penalized for a large amount of internal duplicate content. If that's the case, is there any way to overcome it? All of the reps who use our replicated sites have identical product catalogs inside their sites, which can't be altered because of company rules:

domain.com/site1/store
domain.com/site2/store
domain.com/site3/store

If we block Google from all of the internal replicated pages, would that help us overcome the hit we've taken?
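Something like this in robots.txt is the kind of thing I had in mind, starting with the store sections from the examples above (one Disallow line per rep's site):

User-agent: *
Disallow: /site1/store
Disallow: /site2/store
Disallow: /site3/store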

I sincerely appreciate any ideas you might have!

-Mark
1:25 am on Feb 10, 2012 (gmt 0)

tedster - WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member



Panda has wreaked havoc on duplicate content of all kinds. I think you'd be well off not to let those internal duplicates be indexed - it certainly could help, but you'll only find out if you give it a try.

I sure hope those clients of yours weren't hoping for search traffic. If they were, this wouldn't be a good approach for them to use, at any rate.
2:15 am on Feb 10, 2012 (gmt 0)

10+ Year Member



Our clients can customize their homepage, so I think we're going to set their personal homepages to index, nofollow and then all of the subpages to noindex, nofollow using the meta robots tag...
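For the subpages that would be something like this in the <head> of each page:

<meta name="robots" content="noindex, nofollow">

and on the customized homepages:

<meta name="robots" content="index, nofollow">

(If we go that route we'd leave those pages crawlable rather than blocking them in robots.txt, since Google has to crawl a page to see the noindex tag.)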

I don't really know what else to do.

We're also considering adding a blog to the homepage where our clients can contribute content and get some exposure that way. The extra content should help, too.
3:14 am on Feb 10, 2012 (gmt 0)

sgt_kickaxe - WebmasterWorld Senior Member, Top Contributor of All Time, 5+ Year Member



Panda has wreaked havoc on duplicate content of all kinds.


It used to be that a group of words counted as duplicate only if they appeared in the same order. Now a group of words can be duplicate in any order, and without even containing all of the same words. You can see this when you search for exact-match titles: swap the words around and the same article is still returned more often than not. This is especially true of heavily repeated titles such as those found in product feeds.

As for the subject of this thread, I'd try to have only minimal identical text on each site and encourage owners to write their own unique copy. You'll also want to let them use a keyword or a company name in the URL, since "site1" or "1234567" says nothing about the subject, and neither does the TLD.
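For example, something along these lines, with made-up rep names just to illustrate the idea:

domain.com/site1/store -> domain.com/jane-smith-widgets/store
domain.com/site2/store -> domain.com/acme-team-jones/store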
7:18 am on Feb 10, 2012 (gmt 0)

10+ Year Member



Definitely noindex those internal dupes. Can you keep one lone survivor as the good one?
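If the catalogs really are page-for-page identical, one way to mark a single survivor would be a rel=canonical link on the duplicate store pages, pointing at whichever copy you keep (the /site1/store URL here is just reusing the example paths from earlier in the thread):

<link rel="canonical" href="http://domain.com/site1/store">

That goes in the <head> of the other reps' store pages.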

How about giving each rep a blog and encouraging them to write their own take on stuff? Might help.