
Forum Moderators: goodroi


Robots.txt files

Who to Disallow

9:04 pm on Feb 26, 2004 (gmt 0)

New User

10+ Year Member

joined:Sept 10, 2003
posts:9
votes: 0


This might be a silly question, but does anyone disallow robots from visiting their site, and if so, which spiders do you disallow?

Any help appreciated...
Thanks,
Pete Prestipino

2:38 am on Feb 27, 2004 (gmt 0)

New User

10+ Year Member

joined:Feb 26, 2004
posts:26
votes: 0


For the "how", check out [searchengineworld.com...]

For the list of spiders you don't like, you have to decide that yourself. I block:
vscooter (Altavista Images)
fast
Googlebot-Image
These are bots that attempt to cache my images, which I don't like. Other people block other bots.
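Putting the poster's list into robots.txt form, a minimal sketch might look like the following (user-agent tokens taken as written in the post; the exact token each crawler honors may differ, so check the crawler's own documentation):

User-agent: vscooter
Disallow: /

User-agent: fast
Disallow: /

User-agent: Googlebot-Image
Disallow: /

Each User-agent line starts a record that applies only to that crawler; Disallow: / tells it to stay out of the whole site, while other crawlers are unaffected.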

3:01 am on Feb 27, 2004 (gmt 0)

Full Member

10+ Year Member

joined:May 8, 2002
posts:226
votes: 0


I find the WebmasterWorld robots.txt file useful as a guide.
3:51 am on Feb 27, 2004 (gmt 0)

Senior Member

jdmorgan

WebmasterWorld Top Contributor of All Time

10+ Year Member

joined:Mar 31, 2002
posts:25430
votes: 0


As an alternative to blocking a useful robot such as fast, you can also put all of your "proprietary" images into a subdirectory, and then disallow that subdirectory. Example:

User-agent: fast
Disallow: /images/

Jim
4:21 am on Feb 27, 2004 (gmt 0)

New User

10+ Year Member

joined:Feb 26, 2004
posts:26
votes: 0

I refuse to redesign my website for a bot I've never heard of, coded by people who don't care about users who don't want their images indexed.

I emailed fast and asked them how to stop the bot from collecting images while still letting it collect HTML. They replied with the robots.txt code to disallow /.
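(The reply described here, a blanket disallow for that one agent, would presumably have looked like this; the fast token is as given elsewhere in the thread:

User-agent: fast
Disallow: /

which blocks the bot from everything, HTML included, rather than just images.)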

I emailed google and asked them the same thing. One month later they introduced the google-image agent. Obviously Google cares about webmasters and fast doesn't.

5:08 pm on Mar 1, 2004 (gmt 0)

New User

10+ Year Member

joined:Feb 27, 2003
posts:18
votes: 0


I have 5 domain names all pointing to the same site.

Can I include all 5 URLs in the same robots.txt file?