
Forum Moderators: goodroi


How to exclude a subdomain from crawling


3:02 pm on Feb 10, 2007 (gmt 0)

Junior Member

5+ Year Member

joined:Oct 28, 2006
posts:44
votes: 0


I have a problem. I want to prevent search bots from indexing all the pages on subdomain.example.com, because the content there is not unique and I am afraid it will harm our rankings.

The text on those pages is something many other sites have on their sites too, so it is definitely duplicate content.

Can I use the robots.txt file for this, and if so, how do I do it?

thank you

[edited by: tedster at 3:21 pm (utc) on Feb. 10, 2007]
[edit reason] use example.com, remove specifics [/edit]

9:56 pm on Feb 12, 2007 (gmt 0)

Administrator from US 

WebmasterWorld Administrator goodroi

joined:June 21, 2004
posts:3104
votes: 91


Upload a robots.txt file to subdomain.example.com/robots.txt.

To block all robots, the robots.txt should contain only the following two lines:
User-agent: *
Disallow: /
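
If you want to confirm the file is doing what you expect once it is uploaded, something along these lines can fetch it and test a few paths. This is only a minimal sketch: the subdomain and the sample paths are placeholders for your own URLs, and it assumes you have Python available.

# Sketch: check that subdomain.example.com/robots.txt blocks crawlers.
# The hostname and test paths are placeholders, not taken from the thread.
from urllib import robotparser

parser = robotparser.RobotFileParser()
parser.set_url("http://subdomain.example.com/robots.txt")
parser.read()  # fetches and parses the live robots.txt

for path in ["/", "/some-page.html", "/folder/another-page.html"]:
    url = "http://subdomain.example.com" + path
    # can_fetch() returns False when the rules disallow that user agent
    print(url, "allowed for Googlebot?", parser.can_fetch("Googlebot", url))

With the two-line robots.txt above in place, every path should come back as disallowed, whatever user agent you test with.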

3:37 am on Feb 15, 2007 (gmt 0)

Junior Member

5+ Year Member

joined:Oct 28, 2006
posts:44
votes: 0


But do you think we should do it? Will it harm our site if we do not?
1:00 pm on Feb 15, 2007 (gmt 0)

Administrator from US 

WebmasterWorld Administrator goodroi

joined:June 21, 2004
posts:3104
votes: 91


so it is definitely duplicate content

I'm not a believer in having duplicate content on a site. It is generally possible, with some extra work, to turn it into original content: rewrite it, add new material, or reformat it. The search engines will filter or penalize duplicate content, and from my point of view either one means the same thing = no traffic. My personal preference is to block duplicate content if I can't make it unique.