
Google SEO News and Discussion Forum

    
Addressing a Duplicate Content Issue - Shared Between Your Own Sites
adder
msg:4444141 - 8:49 am on Apr 23, 2012 (gmt 0)

Here's a situation:

I run several sites in closely related niches. Some of the sites display the same product reviews, in effect creating a duplicate content issue.

Although the product reviews make up less than 2% of the total site content (and the remaining 98% is original content), I have a feeling I need to address it as soon as possible.

I don't want to remove the reviews from the sites because the content is actually read by the visitors. So I'll choose my most important site as the main host of the reviews. What should I do with the reviews on the sister sites?

1. Use noindex meta, specify the pages in robots.txt and request the page removal via Webmaster Tools

2. Simply remove the text and replace the pages with an iframe displaying the reviews.
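
Roughly what I mean by each option (the URLs, paths and file names below are just placeholders, not my actual setup):

Option 1 - keep the duplicate page but ask Google not to index it:

    <!-- in the <head> of each duplicate review page on the sister sites -->
    <meta name="robots" content="noindex">

    # plus, in robots.txt on each sister site
    User-agent: *
    Disallow: /reviews/

    (and then a removal request for /reviews/ in Webmaster Tools)

Option 2 - strip the review text out and pull it in from the main site instead:

    <!-- duplicate review page on a sister site, text removed -->
    <iframe src="https://www.main-site.example/reviews/widget-123/" width="100%" height="600"></iframe>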

Which would be your preferred method (or maybe you have another suggestion)?

Thanks.

 

aakk9999
msg:4444182 - 10:55 am on Apr 23, 2012 (gmt 0)

Depending on the number of reviews, what about each site having around half of the reviews on the page and the other half in an iframe blocked by robots.txt?

So both pages show all the reviews to the user, and both pages have on-page review content (half of the reviews) beefing up the page, but there is no duplication.

E.g. one site shows reviews 1, 3 and 5 on-page and reviews 2, 4 and 6 in an iframe. The other site does it the other way around.
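
Very roughly, something like this (the domain and the /review-frames/ path are invented for the example):

On site A's product page:

    <!-- reviews 1, 3 and 5 as normal on-page text -->
    <div class="reviews"> ... review 1 ... review 3 ... review 5 ... </div>
    <!-- reviews 2, 4 and 6 pulled in from a URL that robots.txt blocks -->
    <iframe src="https://www.site-a.example/review-frames/widget-246.html"></iframe>

And in site A's robots.txt:

    User-agent: *
    Disallow: /review-frames/

Site B does the mirror image: 2, 4 and 6 on-page, 1, 3 and 5 in the blocked iframe.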

jkdt0077
msg:4444184 - 11:03 am on Apr 23, 2012 (gmt 0)

Haha, I love the lengths we have to go to so that Google doesn't penalise us. It's getting to the point where people will be afraid to have any content at all in case it somehow offends Google's uncontrollable algorithm.

netmeg
msg:4444254 - 2:01 pm on Apr 23, 2012 (gmt 0)

I actually deal with this quite a bit - I have event sites that are specific to different locations, but some of the events (and the "maintenance" pages) are very similar; in other cases I have repeating events or events that span days or weeks on the same site.

For the most part, as long as the percentage of these is small, I don't really worry about it, and it doesn't seem to have hurt me any. Note that in the case of products, I wouldn't want to be using a manufacturer's description in this scenario.

For some of my longer pages that need to be published across several domains, I will either just put the page on one and link from the other sites, or I will only index the strongest page and noindex/follow the copies on other sites.
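
For the noindex/follow copies, that's just the standard robots meta tag on the duplicates:

    <!-- on the duplicate copies on the other sites: keep them out of the
         index but let the links on them still pass PageRank -->
    <meta name="robots" content="noindex, follow">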

If the page really doesn't need to be in the index at all, I'll just noindex it across the board (search results pages, privacy policies, etc)

So far, I've never had problems with algo changes or pandas.

Pjman
msg:4444274 - 2:21 pm on Apr 23, 2012 (gmt 0)

I always noindex duplicates that are of less value. I also block them via robots.txt if it is a whole directory.

I'm not sure if you have to nofollow them too. I usually don't.

Any advice on that guys?

Robert Charlton
msg:4444389 - 7:34 pm on Apr 23, 2012 (gmt 0)

I'm not sure if you have to nofollow them too. I usually don't.

nofollowing the pages is a bad idea, as it stops the circulation of PageRank through the site.

I think whether you noindex them would in part depend on the length of these reviews and their function. If the reviews are vendor reviews on an ecommerce site and are intended to provide unique content to areas that use basically boilerplate product descriptions, then the dupes might hurt on several of the sites, and you don't necessarily control which ones. Perhaps providing some additional material on some of the dupe review pages might help.

A lot depends on the authority of the site, and beyond that on the authority of the page. Using templated or dupe content on a well-known PR7 site is a very different experience from using it on an obscure site with less outside validation.

1. Use noindex meta, specify the pages in robots.txt and request the page removal via Webmaster Tools

Note that the robots.txt conflicts with the noindex meta. It will prevent Google from spidering the page and seeing the noindex. Also, requesting page removal is IMO both overkill and possibly dangerous... and if you drop robots.txt it shouldn't be necessary.
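
To spell out the conflict (paths invented for the example): with both of these in place, Googlebot never fetches the page, so it never sees the meta tag at all:

    # robots.txt on the sister site
    User-agent: *
    Disallow: /reviews/

    <!-- on /reviews/widget-123.html - blocked from crawling, so never read -->
    <meta name="robots" content="noindex">

Drop the Disallow rule, let the page be crawled, and the noindex alone will get it out of the index.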

adder
msg:4445063 - 8:17 am on Apr 25, 2012 (gmt 0)

Thanks everyone for the input.

Providing additional material to make the reviews look unique is not possible because we're talking about several large sites with a huge number of reviews.

The reviews are there for the benefit of the visitor. Adding content for Google's sake wasn't the main objective. I have already established that I've been hit by Panda, so it's not really a question of whether to make Google de-index the reviews or leave them be. The question is what's the best way of de-indexing them.

So, @Robert Charlton, you reckon I could put noindex on each review page and then just wait for Google to de-index them?

if you drop robots.txt it shouldn't be necessary

Sorry, didn't quite get this bit...

Robert Charlton
msg:4445079 - 8:51 am on Apr 25, 2012 (gmt 0)

you reckon I could put noindex on each review page and then just wait for Google to de-index them?

Yes.

if you drop robots.txt it shouldn't be necessary

Sorry, didn't quite get this bit...


To clarify my last comment: if you don't block the pages in robots.txt, Googlebot can crawl them, see the noindex and remove them from the index... so requesting manual removal won't be necessary.

Also, make sure you don't use "nofollow" in the meta robots noindex.
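
i.e. (just to make it concrete) use the first of these, not the second:

    <meta name="robots" content="noindex, follow">     <!-- deindexes the page, links still pass PageRank -->
    <meta name="robots" content="noindex, nofollow">   <!-- avoid: also cuts off the flow of PageRank -->

(A bare "noindex" is treated as "noindex, follow" by default.)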

Planet13
msg:4445997 - 1:58 pm on Apr 26, 2012 (gmt 0)

I have already established that I've been hit by Panda...


What do you think caused you to be hit by Panda?

Was it specifically the duplicate reviews?

My concern is that if it is the same reviews on two different sites, then aren't the products the same?

And if the products are the same, then maybe that is what caused the Panda slap?

Obviously, I don't have a clear understanding of how your sites work, so I hope this is not a distraction from your main point.

adder
msg:4447919 - 12:08 pm on May 1, 2012 (gmt 0)

@Planet, thank you for the input.

The products are essentially the same, but the product pages have a different feel on each of the sites... Besides, in this niche we have hundreds (or maybe thousands) of similar sites competing, selling exactly the same products/services.

As for how I established this was Panda:

I pulled up organic traffic reports from before and after the Panda update, divided the site into clear sections and compared performance for each section to the overall performance.

So, let's assume the dip in overall traffic pre/post-Panda is 20%. Any section whose own dip falls within +/-10 percentage points of that (i.e. roughly a 10-30% drop) I would just ignore. Where the difference is significantly higher, I'd take a closer look.

As far as the reviews section was concerned, it literally stopped receiving any meaningful organic traffic after the update while some better performing sections stayed within the normal range or in some rare cases even improved.

Planet13
msg:4447939 - 1:18 pm on May 1, 2012 (gmt 0)

Hi there, adder:

The products are essentially the same but the product pages have a different feel on each of the sites...

So is it fair to say that you are trying to sell the same products to two different market groups (i.e., two different demographics) and thus you NEED to have two VISUALLY different styles of websites?

Or is it that there are so many competitors that you just decided that the more websites you have, the more likely it is that some pages will rank well in Google?

If it is the latter, I don't know what to say. All I can say is that it goes against most of the advice I have read and received over the last two to three years.

If it is at all possible to combine your sites into one authority site, I think in the long run it is going to work out better.

However, if you have to keep the look and feel different because you are trying to reach different niche demographic groups, well, that is going to be a bit of a challenge.

adder
msg:4447981 - 2:16 pm on May 1, 2012 (gmt 0)

trying to sell the same products to two different market groups


Bang on, that's exactly what it is. Two entirely different demographics and lifestyles. Even different postcodes.

I've been trying to combine the sites, and it's mainly a matter of persuading certain people... I might get there in the end. What I'd like to do is delete the sister sites and forget about them.

Anyway, I've now added the noindex meta tag to the dupe reviews and will report back with my findings. No movement so far, but I expect it might take a couple of months for Google to de-index the offending pages.

Planet13
msg:4448047 - 4:24 pm on May 1, 2012 (gmt 0)

Ah, I see. It looks like you are NOT in a position to "make it so," and have to live with the decisions of the boss.

My feeling as a user is that if I went to site A and it was geared toward one demographic, then I would want to read reviews FROM THAT DEMOGRAPHIC THE SITE WAS GEARED TO. I would not want to read reviews from users who fit into an entirely different demographic.

But that's just me.
