

Need assistance with robot controls

Need help with a robots.txt file

   
1:44 am on Oct 25, 2004 (gmt 0)

10+ Year Member



I am having a difficult time with robots. While checking my stats, I found that MSN was trying to reach subdomain pages through my main domain.

That is, I found MSN spidering
[domain.com...] (a path that does not exist)

It should be spidering [subdomain.com...]

(By 'subdomain' I mean shared hosting: 'www.domain.com' and 'www.subdomain.com' both come up as live sites, not 'www.subdomain.domain.com'.)

I have been advised to "put a robots.txt file in the root that excludes the subdirectories of the subdomains, then put a robots.txt inside the subdomain directory that excludes it from going back to the root," but I have no clue how to do this.
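
If I am reading that right, they mean something like this ('subsite' below is just a stand-in for whatever the subdomain's folder under my main root is really called):

robots.txt in the main root, fetched as www.domain.com/robots.txt:

# keep spiders out of the subdomain's folder when they come in via the main domain
User-agent: *
Disallow: /subsite/

robots.txt inside the subdomain's folder, fetched as www.subdomain.com/robots.txt:

# an empty Disallow means everything on the subdomain itself stays crawlable
User-agent: *
Disallow:

Is that roughly the idea, or am I off track?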

Any help is greatly appreciated!

11:31 pm on Oct 29, 2004 (gmt 0)

jdmorgan - WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member



Do you map subdomains to subdirectories, or anything like that? An error in implementing such a mapping could confuse the 'bot.

You might also want to implement a 301 (Moved Permanently) redirect that sends all requests for www.<subdomain>.domain.com to subdomain.domain.com, in order to get the search listings corrected sooner, but look for the underlying problem first.
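
If Apache with mod_rewrite is what you're running, something along these lines in the subdomain's .htaccess would handle that (a sketch only; substitute your real hostnames):

RewriteEngine On
# send any request arriving as www.subdomain.domain.com over to subdomain.domain.com
RewriteCond %{HTTP_HOST} ^www\.subdomain\.domain\.com$ [NC]
RewriteRule ^(.*)$ http://subdomain.domain.com/$1 [R=301,L]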

Jim
