
Forum Moderators: goodroi


Robots.txt files

Who to Disallow


Uniterra

9:04 pm on Feb 26, 2004 (gmt 0)

10+ Year Member



This might be a silly question, but does anyone disallow robots from visiting their site, and if so, which spiders do you disallow?

Any help appreciated...
Thanks,
Pete Prestipino

BarkerJr

2:38 am on Feb 27, 2004 (gmt 0)

10+ Year Member



For the "how", check out [searchengineworld.com...]

As for which spiders you don't like, you have to decide that for yourself. I block:
vscooter (Altavista Images)
fast
Googlebot-Image
These are bots that attempt to cache my images, which I don't like. Other people block other bots.
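For reference, the blocks described above would look something like this in robots.txt (the user-agent tokens here are taken from the list in the post; each crawler's exact token should be verified against its documentation before relying on it):

User-agent: vscooter
Disallow: /

User-agent: fast
Disallow: /

User-agent: Googlebot-Image
Disallow: /

Each User-agent line starts a record, and the Disallow: / under it tells that one crawler to stay out of the entire site; other crawlers are unaffected unless they match a record of their own.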

keeper

3:01 am on Feb 27, 2004 (gmt 0)

10+ Year Member



I find the webmasterworld robots.txt file useful as a guide.

jdMorgan

3:51 am on Feb 27, 2004 (gmt 0)

WebmasterWorld Senior Member jdmorgan is a WebmasterWorld Top Contributor of All Time 10+ Year Member



As an alternative to blocking a useful robot such as fast, you can also put all of your "proprietary" images into a subdirectory, and then disallow that subdirectory. Example:

User-agent: fast
Disallow: /images/

Jim

BarkerJr

4:21 am on Feb 27, 2004 (gmt 0)

10+ Year Member


I refuse to redesign my website for a bot I've never heard of, coded by people who don't care about users who don't want their images indexed.

I emailed fast and asked them how to stop the bot from collecting images but still have it collect HTML. They replied with the robots.txt code to disallow /.
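In other words, the reply presumably amounted to a full-site block for that crawler, something like:

User-agent: fast
Disallow: /

That shuts the bot out of everything, HTML included, which is exactly why it was an unsatisfying answer to the question asked.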

I emailed Google and asked them the same thing. One month later they introduced the Googlebot-Image agent. Obviously Google cares about webmasters and fast doesn't.

Pinetree

5:08 pm on Mar 1, 2004 (gmt 0)

10+ Year Member



I have 5 domain names all pointing to the same site.

Can I include all 5 URLs in the same robots.txt file?