I want to ban all spiders from one of my sites, but have the other six continue to be crawled normally. How should I go about doing this? (I'm afraid of accidentally banning spiders from all seven sites.)
My guess is to put the following robots.txt file in the same directory as my index.html for the site I want ignored:
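For reference, the standard robots.txt that tells all compliant crawlers to stay off an entire site is just two lines:

```
User-agent: *
Disallow: /
```

Note this only keeps out well-behaved bots; it's a request, not an enforcement mechanism.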
Is that correct? Or would a robots.txt file have to go in the root dir of my virtual account?
Any help would be greatly appreciated.
Put it wherever it will be served as http://www.ignoredsite.com/robots.txt, since crawlers only ever request it from the root of the domain. Yes, that's probably the same directory as the index file, but it does depend on your server configuration. Just upload it there and test by typing the robots.txt URL into your browser.
If you're worried about the other domains, then test them, too. If you haven't put a robots.txt on them, you should get a 404, which is harmless: crawlers treat a missing robots.txt as permission to crawl.
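If you'd rather script that check than click through all seven domains in a browser, here's a small sketch; the helper name and the idea of looping over your own hostnames are mine, not anything the server requires:

```python
# Check whether a host serves a robots.txt by fetching it and
# reporting the HTTP status code: 200 means one is being served,
# 404 means none exists (so crawling is allowed by default).
import urllib.error
import urllib.request

def robots_status(host):
    """Return the HTTP status code for http://<host>/robots.txt."""
    try:
        with urllib.request.urlopen(f"http://{host}/robots.txt", timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # e.g. 404 when no robots.txt is present
```

Call it once per domain (e.g. `robots_status("www.ignoredsite.com")`) and confirm the site you're blocking returns 200 while the others return 404.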