
Keyword Discussion Forum

Does duplicate content on a sub-domain affect the rankings of root domain?

Msg#: 4352306 posted 4:13 am on Aug 17, 2011 (gmt 0)

We recently moved a community website that we own to our main domain. It now lives on our website as a sub-domain. This new sub-domain has a lot of duplicate page titles.

We are going to clean it up, but it's a huge project. (We had tried to clean it up even before migrating the community website.)

I am wondering if this duplicate content on the new sub-domain could be hurting the rankings of our root domain. How does Google treat it?

From SEO best practices, I know duplicate content within a site is always bad.

How severe is it given the fact that it is present on a different sub-domain?

Thanks for the feedback!

Update: I need to move this topic to the Google SEO forum. Not sure how to do this.



WebmasterWorld Senior Member 10+ Year Member

Msg#: 4352306 posted 3:28 pm on Aug 17, 2011 (gmt 0)

I wouldn't say that you're getting penalized--you're just missing out on all of the potential traffic to your site by not having your title tags optimized.

Yes, duplicate content is never really a good thing. The best option for you at this time is to decide where the content should reside, then use robots.txt and Google Webmaster Tools to make sure the duplicates aren't being indexed.
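A minimal sketch of the robots.txt side of that advice -- the directory names here are hypothetical examples, not paths from the poster's site:

```
User-agent: *
# Hypothetical: block crawling of the migrated community's duplicate views
Disallow: /print/
Disallow: /archive/
```

Note this only stops compliant crawlers from fetching those URLs; it is not by itself a guarantee they drop out of the index, which comes up later in the thread.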


Msg#: 4352306 posted 11:03 am on Aug 31, 2011 (gmt 0)

From an SEO point of view, duplicate content is bad and it may hurt your rankings. You said that you have the same titles on various pages, which means you are targeting the same keywords on multiple pages; that can cause the pages of your own site to compete with each other for rankings. Try to use different titles and remove the duplicate content for better search engine visibility.


Msg#: 4352306 posted 2:47 pm on Aug 31, 2011 (gmt 0)

Thanks Everyone!
Our rankings are back. It was a different issue that caused the rankings drop.

Nevertheless, we need to remove this duplicate content. It's user-generated content, so we don't have much control over it.


I configured "URL Parameters" within Webmaster Tools to let Google know about the dynamic parameters that are causing duplicate page titles. It's been 3 weeks and Google seems to be ignoring it.
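Another common fix for parameter-driven duplicates, alongside the URL Parameters setting, is a rel="canonical" link on each parameterized version pointing at the preferred URL. The domain, thread path, and parameter name below are made-up examples:

```html
<!-- Served on a parameterized duplicate such as
     https://community.example.com/thread/123?sort=oldest -->
<link rel="canonical" href="https://community.example.com/thread/123">
```

Unlike a robots.txt block, the canonical hint still lets Google crawl the duplicates and consolidate them onto the preferred URL.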

If we add entries in robots.txt to get rid of the duplicate URLs, will they be removed from Google's index?

As per my understanding, robots.txt prevents crawling of the URLs, but if there are links pointing to them, those URLs cannot be removed from Google's index by robots.txt alone.
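That understanding matches how robots.txt works: it is a crawling directive, not an indexing one. A small sketch with Python's standard-library robots.txt parser shows what a Disallow rule actually controls -- which URLs a compliant crawler may fetch. The subdomain and paths are hypothetical:

```python
# Sketch: how a crawler interprets a Disallow rule (hypothetical paths).
from urllib.robotparser import RobotFileParser

rules = [
    "User-agent: *",
    "Disallow: /forum/print/",  # hypothetical duplicate "print view" URLs
]

rp = RobotFileParser()
rp.parse(rules)

# The blocked path may not be fetched; everything else may be.
print(rp.can_fetch("*", "https://community.example.com/forum/print/123"))   # False
print(rp.can_fetch("*", "https://community.example.com/forum/thread/123"))  # True
```

A URL that can't be fetched can still be indexed from external links, which is why a noindex directive (which Google must be able to crawl the page to see) is the usual route for actually removing URLs from the index.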

Would love to hear your thoughts!

WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved