Forum Moderators: buckworks

Subdomains and Duplicate Pages

Do search engines penalize?


mdean

10:04 pm on Apr 16, 2005 (gmt 0)

10+ Year Member



We are adding a sub-domain which will carry all of the products we already list on our regular domain. By having duplicate pages, will the search engines penalize us or do they consider a subdomain to be an entirely separate domain in and of itself?

robotsdobetter

10:32 pm on Apr 16, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



They consider a subdomain to be a separate domain. You most likely won't get banned; more commonly the duplicate pages receive a duplicate-content penalty, so they won't rank very well for the keywords targeted on those pages.

I suggest that you don't do it, or else that you block the spiders from the new pages that carry the duplicate content.

mdean

10:58 pm on Apr 16, 2005 (gmt 0)

10+ Year Member



I am not so worried about the duplicate pages on the SECONDARY site being penalized, since we will already have them on our main site. We will not really advertise the second ones; visitors will just see them because they happen to be at the site looking at the other products.

Do you still think we shouldn't do it?

robotsdobetter

6:52 am on Apr 17, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If you're going to do it, I suggest that you block the spiders from the new pages, or you could lose your rankings for all the pages that share the same content.
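As a sketch of that approach: if the subdomain serves its own robots.txt at its root (hostname here is hypothetical), a file disallowing everything would keep well-behaved crawlers off the duplicate pages entirely:

```
# robots.txt served at the root of the subdomain
# (hypothetical host: products.example.com)
# Tells all compliant crawlers not to fetch any page on this host
User-agent: *
Disallow: /
```

Note that each subdomain needs its own robots.txt; the one on the main domain does not apply to the subdomain.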

mdean

2:46 pm on Apr 17, 2005 (gmt 0)

10+ Year Member



But does it REALLY work to block spiders? I've heard they pay no attention to it anyway.

lorax

5:56 pm on Apr 17, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



>> But does it REALLY work to block spiders? I've heard they pay no attention to it anyway.

That all depends on the method used and the spider you're talking about. If you use your .htaccess file correctly, you can block anyone from getting access to one or more pages of your website, because the server refuses the request outright. Using nofollow or robots.txt is an iffy proposition at best: both are advisory directives that well-behaved crawlers honor voluntarily, and a rogue spider can simply ignore them.
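A minimal sketch of the .htaccess approach, using standard Apache directives of the era (mod_setenvif plus Order/Allow/Deny); the bot names here are only examples, and you would place the file in the directory holding the duplicate pages:

```
# .htaccess in the directory with the duplicate content
# Flag requests whose User-Agent matches a known crawler
SetEnvIfNoCase User-Agent "Googlebot" blocked_bot
SetEnvIfNoCase User-Agent "Slurp" blocked_bot

# Serve everyone except flagged crawlers (they get a 403)
Order Allow,Deny
Allow from all
Deny from env=blocked_bot
```

Unlike robots.txt, this is enforced server-side, though a spider that lies about its User-Agent can still get through; blocking by IP range is the only airtight option.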