
Block All Spiders

How do I setup the robots.txt?


jady

4:04 am on Jan 20, 2003 (gmt 0)

10+ Year Member



Just a quick question on something that I need help with. We have a client login site set up and want to block the entire site and all of its contents from any SE spiders. (Never thought we would actually say that we don't want Googlebot to crawl anything on this site.) It is just for our clients to view their work.

How do I set up robots.txt to disallow all from all directories?

Does this work:

User-agent: *
Disallow: /

Thanks for any help/advice! :)

Macguru

4:07 am on Jan 20, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yes, that will work, except I would call the file "robots.txt" instead of "Robots.txt".

ukgimp

9:25 am on Jan 20, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Remember, this will only keep well-behaved bots out of a private area. There have also been a small number of cases where Googlebot has been found lurking in disallowed areas.

If you have an area where you need to be sure to block prying eyes and bots, use an .htaccess file to do this. It is better to be safe(r) than sorry.
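
For example, a minimal .htaccess sketch using Apache basic authentication (the path to the password file and the realm name here are just placeholders, and this assumes your host allows AuthConfig overrides):

AuthType Basic
AuthName "Client Login Area"
AuthUserFile /path/to/.htpasswd
Require valid-user

You would create the password file separately with the htpasswd utility. Unlike robots.txt, this actually refuses the request rather than politely asking crawlers to stay out.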

Cheers