Suppose I want to block only one or two bots from my site, but allow bots in general to access only my domain's root directory.
So I create a robots.txt like this in the root (the bot names here are just placeholders for the two I'd actually block):

User-agent: BadBot
Disallow: /

User-agent: OtherBadBot
Disallow: /
But then I go to my subdirectories and block them from all robots by placing a separate robots.txt file in each of them.
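Each of those per-directory files would just be the catch-all block:

```
User-agent: *
Disallow: /
```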
Would this work?
Or is robots.txt only read if it is in the root directory of the domain?
Or are bots instructed to look for only one copy of the file per domain?
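As far as I can tell, standard parsers build the robots.txt URL from the scheme and host alone, ignoring the page's path entirely. A small sketch of that behavior (example.com and the bad-bot names are just placeholders):

```python
from urllib.parse import urlsplit, urlunsplit

def robots_url(page_url):
    # Derive the single robots.txt location a crawler would fetch:
    # only the scheme and host are kept; the page's path is discarded.
    parts = urlsplit(page_url)
    return urlunsplit((parts.scheme, parts.netloc, "/robots.txt", "", ""))

print(robots_url("http://example.com/subdir/page.html"))
# -> http://example.com/robots.txt
```

If that's how every bot works, the robots.txt files I drop into subdirectories would never even be requested.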
Curiosity is bothering me because none of the literature I've found on the net seems to cover this idea.
Ref: A Standard for Robot Exclusion [robotstxt.org]