WebmasterWorld


Does duplicate content on a sub-domain affect the rankings of root domain?

4:13 am on Aug 17, 2011 (gmt 0)

New User

5+ Year Member

joined:Feb 7, 2011
posts: 12
votes: 0

We recently moved a community website that we own to our main domain. It now lives on our website as a sub-domain. This new sub-domain has a lot of duplicate page titles.

We are going to clean it up, but it's a huge project. (We had tried to clean it up even before migrating the community website.)

I am wondering if this duplicate content on the new sub-domain could be hurting rankings of our root domain? How does Google treat it?

From SEO best practices, I know duplicate content within a site is always bad.

How severe is it given the fact that it is present on a different sub-domain?

Thanks for the feedback!

Update: I need to move this topic to the Google SEO forum. Not sure how to do this.
3:28 pm on Aug 17, 2011 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Jan 28, 2003
votes: 0

I wouldn't say that you're getting penalized--you're just missing out on all of the potential traffic to your site by not having your title tags optimized.

Yes, duplicate content is never really a good thing. The best option for you at this time is to decide where the content should reside, then use robots.txt and Google Webmaster Tools to make sure the duplicates aren't being indexed.
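To illustrate the robots.txt side of that advice, here's a minimal sketch. The paths are made up -- substitute the actual sub-domain directories where your duplicates live:

```
# robots.txt -- hypothetical example; adjust paths to your own site.
User-agent: *
# Keep crawlers out of the duplicated community archive pages
Disallow: /community/archive/
# Skip URLs that differ only by a sort parameter
Disallow: /community/*?sort=
```

Keep in mind this only stops crawling; it doesn't by itself pull already-indexed URLs out of the index.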
11:03 am on Aug 31, 2011 (gmt 0)

New User

joined:July 18, 2011
votes: 0

From an SEO point of view, duplicate content is bad and may hurt your rankings. You said you have the same titles on various pages, which means you are targeting the same keywords on multiple pages; this can cause the pages of your own site to compete with each other for rankings. Use distinct titles and remove the duplicate content for better search engine visibility.
2:47 pm on Aug 31, 2011 (gmt 0)

New User

5+ Year Member

joined:Feb 7, 2011
posts: 12
votes: 0

Thanks Everyone!
Our rankings are back. It was a different issue that caused the ranking drop.

Nevertheless, we need to remove this duplicate content. It's user-generated content, so we don't have much control over it.


I configured "URL parameters" within webmaster tools to let Google know about these dynamic parameters that are causing duplicate page titles. It's been 3 weeks and Google seems to ignore this.
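Besides the URL-parameters setting, another option for parameter-driven duplicates is rel="canonical", which tells Google which version of a page to treat as the one to index. A sketch, with made-up URLs:

```
<!-- On a duplicate such as http://example.com/community/thread?id=42&sort=newest -->
<!-- Point search engines at the preferred, parameter-free version -->
<link rel="canonical" href="http://example.com/community/thread?id=42">
```

The canonical tag goes in the head of every duplicate variant, all pointing at the same preferred URL.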

If we add entries in robots.txt to block the duplicate URLs, will they be removed from Google's index?

As I understand it, robots.txt prevents crawling of the URLs, but if there are links pointing to them, those URLs cannot be removed from Google's index by robots.txt alone.
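Your understanding is right, and it's also why the noindex route requires the opposite of a robots.txt block: Googlebot has to be able to crawl the page to see the directive. A sketch of the meta tag (the pages carrying it must not be disallowed in robots.txt):

```
<!-- In the <head> of each duplicate page. The page must stay
     crawlable so Googlebot can fetch it and see this directive;
     "follow" lets link equity still pass through the page. -->
<meta name="robots" content="noindex, follow">
```

Once Google has recrawled the pages and dropped them from the index, you could then add robots.txt blocks if you want to save crawl budget.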

Would love to hear your thoughts!
