
Noindex tag: enough to avoid duplicate content?

Has anyone got a definitive answer on this?

1:34 pm on Sep 8, 2006 (gmt 0)

New User

10+ Year Member

joined:May 23, 2005
posts:31
votes: 0


I appreciate this subject has been mentioned before, but no definitive answer seems to have come up. We have a number of sites which have duplicate content across them, quite legitimately (the same news is being presented to different industry channels), but we believe this duplicate content is the cause of massive June 27 penalties. Removing the duplicate pages is not really an option, but if we mark the duplicates with a robots noindex tag, would that "remove" the duplication problem in Google's eyes?
7:26 pm on Sep 8, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:July 3, 2002
posts:18903
votes: 0


Yes it would. Been there. Done that.

A site with just 50,000 "real" pages was exposing 750,000 different URLs to search engines. Using robots exclusions, Google has already deindexed all of the alternative URLs, and now the canonical URL for each page of content ranks a little higher too.
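For anyone finding this thread later, the noindex tag being discussed is a standard meta tag placed in the <head> of each duplicate page. A typical snippet looks like this (generic example, not taken from either poster's site):

```html
<!-- Place inside the <head> of each duplicate page. -->
<!-- Note: the page must remain crawlable (i.e. NOT blocked in
     robots.txt), or the spider will never see this tag. -->
<meta name="robots" content="noindex, follow">
```

The "follow" value tells spiders to keep following the links on the page even though the page itself is kept out of the index.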
