I guess that's a rather simple question, but I can't seem to find a definite answer. Let's say I want to block one "search engine", e.g., ia_archiver, and allow all others. Is this how my robots.txt would have to look?

# robots.txt for mydomain.com
User-agent: ia_archiver
Disallow: /
# end
Or will I have to add the following at the end of the file:
User-agent: *
Disallow:
# end
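In other words, the complete file I have in mind would be the two records combined (if I understand correctly, the empty Disallow line means "allow everything"):

```
# robots.txt for mydomain.com

# Block the Internet Archive's crawler from the whole site
User-agent: ia_archiver
Disallow: /

# All other crawlers may access everything
User-agent: *
Disallow:
# end
```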
If I add this, will ia_archiver back off as soon as it reads that it's not welcome, or will it interpret the whole robots.txt and match the "global allow" entry?
PS: Hope I am in the right forum. Couldn't find a more appropriate one in the "Forum Index".