
Forum Moderators: goodroi



Is it completely necessary?

1:07 pm on Jun 15, 2004 (gmt 0)

Full Member

10+ Year Member

joined:Mar 7, 2003
votes: 0

Hi all,

I understand this is to direct the crawlers/spiders to crawl only certain pages on your site.

What is the advantage of this? I'm not sure I want to stop any of my pages from being crawled. Presumably this depends on what content and pages I have on offer.

Could someone explain a little? Many thanks.


1:23 pm on June 15, 2004 (gmt 0)

New User

10+ Year Member

joined:Feb 24, 2004
votes: 0

If you have no pages that you want to exclude, the only thing a robots.txt file will stop is the 404 errors logged when spiders request the file and it isn't there. (Not a big deal IMO)
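For reference, a minimal robots.txt that excludes nothing but still stops those 404s is just:

```
User-agent: *
Disallow:
```

An empty Disallow value means nothing is off-limits; any spider that requests /robots.txt simply gets this file back instead of a 404.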
1:34 pm on June 15, 2004 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Aug 13, 2003
votes: 0

You can block certain spiders altogether that you may not want crawling your site (for example, foreign search engines or image spiders such as Google's image crawler). This can save bandwidth, particularly if you have a lot of pages or large images.
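As a sketch of that, a robots.txt like the one below shuts out Google's image crawler (which announces itself as Googlebot-Image) while leaving every other spider free to crawl the whole site:

```
# Block Google's image spider from everything
User-agent: Googlebot-Image
Disallow: /

# All other spiders may crawl everything
User-agent: *
Disallow:
```

Spiders match the most specific User-agent record that applies to them, so Googlebot-Image follows its own record rather than the catch-all one.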