Forum Moderators: Robert Charlton & goodroi


Duplicate Online Store Content?

Multiple product views ... do they have a negative impact?

         

eEagle630

4:10 pm on Aug 30, 2005 (gmt 0)

10+ Year Member



I currently manage a few online stores and allow customers to browse products using a variety of different methods ... such as "Browse by Theme" or "Browse by Company." The product URL differs depending on which method you use to view the product. The content of the page, however, is essentially the same no matter how you arrived at the product.

Should I choose one method to allow google to index and then use a noindex/nofollow meta tag for google in the other product views? Is G going to penalize based on duplicate content with multiple product views within the same site? And should I only publish one particular view in the sitemap or all the views of each product?

- eEagle630

g1smd

9:40 am on Sep 2, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



If you can reach the same content through more than one URL, then that is duplicate content.

You could put <meta name="robots" content="noindex"> on one version of the page, but do make sure that it does not cut the bot off from being able to spider the rest of the site.

Put rel="nofollow" on "buy" links to stop those being indexed too.
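As a sketch of the two tags the post above suggests, combined on one hypothetical product page (the cart URL and link text are made up for illustration):

```html
<!-- In the <head> of each alternate product view you do NOT want
     indexed; the page stays out of the index, but links on it can
     still be followed: -->
<meta name="robots" content="noindex">

<!-- On the "buy" links themselves (hypothetical URL), so the cart
     URLs are not treated as pages worth crawling and indexing: -->
<a href="/cart.php?add=1234" rel="nofollow">Buy now</a>
```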

Make sure that URLs are always constructed the same way:

shirts.php?size=20&colour=blue&logo=75 and
shirts.php?colour=blue&size=20&logo=75 and
shirts.php?colour=blue&logo=75&size=20 etc. all serve the same page, but to a search engine they are three duplicate URLs.
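One way to enforce a single parameter order is to route every internal link through a helper that sorts the query string alphabetically before emitting it. A minimal Python sketch (the function name and example URLs are illustrative, not from the thread):

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

def canonical_url(url):
    """Return the URL with its query parameters sorted alphabetically,
    so every internal link to the same page uses one consistent form."""
    parts = urlsplit(url)
    query = urlencode(sorted(parse_qsl(parts.query)))
    return urlunsplit((parts.scheme, parts.netloc, parts.path, query, parts.fragment))

# All three variants from the post collapse to one canonical form:
urls = [
    "http://example.com/shirts.php?size=20&colour=blue&logo=75",
    "http://example.com/shirts.php?colour=blue&size=20&logo=75",
    "http://example.com/shirts.php?colour=blue&logo=75&size=20",
]
print({canonical_url(u) for u in urls})  # one URL, not three
```

If every template builds its links this way, the ordering problem never reaches the crawler in the first place.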

topr8

10:27 am on Sep 2, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



imho you don't need to worry at all.
(i'm assuming you mean duplicate pages across a single domain)

what you are doing is a totally legitimate use of your site ...

here's how the dupe content penalty works: google decides which duplicate page is more important (using its own criteria); that page is then ranked in the normal way, and the other (duplicate) pages are downgraded or not ranked

... therefore you will still get a page that ranks.

the problem with dupe content is when it resides on different domains (like when someone has stolen your content) because it might not be 'your' page that ranks.

morags

11:13 am on Sep 2, 2005 (gmt 0)

10+ Year Member



I have a similar problem - but, by necessity, the content is displayed on two separate sites (different domains). I know that this WILL trigger a duplicate penalty.

I have considered blocking bots from one of the sites, using robots.txt - meaning just one of the sites will be indexed. This is not a problem. However, I have read (here on WebmasterWorld) that G will index a page if it is linked to from elsewhere (without regard to robots.txt). This doesn't sound right.

Is this true? Would I be safer using "noindex" in the meta tags instead?

g1smd

11:29 am on Sep 2, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



It will not index the page, but it will list the URL as a URL-only result in the SERPs, simply because Google has seen the URL and knows that it exists. What it will not do is index the content of those pages.
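The distinction can be seen by putting the two mechanisms side by side (the path here is illustrative):

```
# robots.txt — Disallow stops the bot from FETCHING the pages.
# Google never sees the content, but if a blocked URL is linked
# from elsewhere it can still appear as a URL-only SERP entry.
User-agent: *
Disallow: /duplicate-section/

# By contrast, a meta tag in the page's <head> requires the bot
# to be ALLOWED to fetch the page so it can read the instruction,
# after which the page is kept out of the index entirely:
#   <meta name="robots" content="noindex">
```

So for morags' case, noindex on the secondary domain achieves what robots.txt alone cannot: the pages are fetched, the instruction is read, and neither content nor URL is listed.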

etrader

11:36 am on Sep 2, 2005 (gmt 0)

10+ Year Member



I have created my own review pages from scratch; however, I have copied the feature lists from amazon - they are in the form:

aperture item
shutter speed item
power source item
etc. - these are pretty hard to change and still be original, imo

Will I have duplicate problems with this data?

morags

11:44 am on Sep 2, 2005 (gmt 0)

10+ Year Member



Thanks g1smd. It's what I thought. Will PR still pass through the non-indexed page? ie if the page is getting incoming PR, I'd still like to pass it around the site (to pages which have no duplicate content of course).