
Forum Moderators: goodroi



is it completely necessary



1:07 pm on Jun 15, 2004 (gmt 0)

10+ Year Member

hi all,

i understand robots.txt is there to direct the crawlers/spiders to only crawl certain pages on your site.

what is the advantage of this?
i'm not sure i want to stop any of my pages being crawled.
presumably this depends on exactly what content and pages i have on offer.
could someone explain a little?
many thanks



1:23 pm on Jun 15, 2004 (gmt 0)

10+ Year Member

If you have no pages that you want to exclude, the only thing a robots.txt file will stop is the 404 errors logged when spiders request it and it isn't there. (Not a big deal IMO)
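For what it's worth, a minimal robots.txt that excludes nothing (and just satisfies those spider requests) is only two lines:

```
User-agent: *
Disallow:
```

An empty Disallow line means no URLs are disallowed, so every crawler may fetch everything.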


1:34 pm on Jun 15, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

You can block certain spiders altogether that you may not want crawling your site (for example, foreign search engines, or image spiders such as Google's image crawler). This can save bandwidth, particularly if you have a lot of pages or large images.
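As a sketch of that idea: the rules below block Google's image crawler (its real user-agent token is Googlebot-Image) from the whole site while leaving everything open to other spiders; the paths are just illustrative. Python's standard-library urllib.robotparser can be used to sanity-check the rules before uploading the file:

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt: deny the image spider everywhere,
# allow all other crawlers everywhere.
rules = """\
User-agent: Googlebot-Image
Disallow: /

User-agent: *
Disallow:
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Googlebot-Image", "/photos/pic.jpg"))  # False
print(rp.can_fetch("Googlebot", "/index.html"))            # True
```

Each `User-agent` block applies to the named spider; `Disallow: /` shuts that spider out of the whole site, while the empty `Disallow:` under `*` excludes nothing for everyone else.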
