
Forum Moderators: goodroi


robots.txt -- block one user-agent, allow all others

1:22 am on May 23, 2002 (gmt 0)

Junior Member

10+ Year Member

joined:Mar 20, 2002
votes: 0

I guess this is a rather simple question, but I can't seem to find a definite answer. Let's say I want to block one "search engine", e.g. ia_archiver, and allow all others. Is this how my robots.txt would have to look:

# robots.txt for mydomain.com
User-agent: ia_archiver
Disallow: /
# end

Or will I have to add the following at the end of the file:

User-agent: *
# end

If I add this, will "ia_archiver" back off as soon as it reads that it's not welcome, or will it interpret the whole robots.txt and match the "global allow" entry?

PS: Hope I am in the right forum. Couldn't find a more appropriate one in the "Forum Index".

1:44 am on May 23, 2002 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Feb 27, 2002
votes: 0

User-agent: ia_archiver
Disallow: /

That should be all you need. A robot that finds no User-agent record matching its own name is allowed everything by default, so the catch-all record is optional. You can always use a syntax checker - Brett has one at [searchengineworld.com...]
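If you want to sanity-check the matching behaviour yourself, Python's standard library ships a robots.txt parser (urllib.robotparser) that follows the same record-matching rules. This is just a local sketch of the file discussed above, not fetched from any live site:

```python
# Sanity-check the two-record robots.txt with Python's stdlib parser.
# The rules are fed in as a string, so no live site is needed.
import urllib.robotparser

rules = """\
User-agent: ia_archiver
Disallow: /

User-agent: *
Disallow:
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# ia_archiver matches its own record and is blocked everywhere...
print(rp.can_fetch("ia_archiver", "/page.html"))  # False
# ...while any other crawler falls through to the catch-all record.
print(rp.can_fetch("Googlebot", "/page.html"))    # True
```

The same test also passes if you delete the `User-agent: *` record entirely, which confirms the point above: robots with no matching record are allowed by default.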

