
Noindex tag: enough to avoid duplicate content?

Has anyone got a definitive answer on this?



1:34 pm on Sep 8, 2006 (gmt 0)


I appreciate this subject has been mentioned before, but no definitive answer seems to have come up. We have a number of sites which have duplicate content across them, quite legitimately (the same news is being presented to different industry channels), but we believe this duplicate content is the cause of massive June 27 penalties. Removing the duplicate pages is not really an option, but if we mark the duplicates with a robots noindex tag, would that "remove" the duplication problem in Google's eyes?
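For reference, the tag being asked about is a meta element placed in the head of each duplicate page. A minimal sketch (the `noindex,follow` value is one common form — it keeps the page out of the index while still letting crawlers follow its links):

```html
<head>
  <!-- Ask search engines not to index this page, but still follow its links -->
  <meta name="robots" content="noindex,follow">
</head>
```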


7:26 pm on Sep 8, 2006 (gmt 0)

g1smd (WebmasterWorld Senior Member)

Yes it would. Been there. Done that.

A site with just 50,000 "real" pages was exposing 750,000 different URLs to search engines. Using robots exclusions, Google has already deindexed all of the alternative URLs. Now the canonical URL for each page of content ranks a little higher too.
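Robots exclusions of the kind described above can also be applied site-wide via robots.txt rather than per-page meta tags. A hypothetical sketch — the paths here are invented for illustration, not taken from the site in question (note that Disallow blocks crawling of those URLs rather than tagging each one noindex):

```text
# robots.txt — keep crawlers away from duplicate URL variants
User-agent: *
Disallow: /print/
Disallow: /archive/
```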

