
Do robots.txt exclusions in SafeSearch affect an empty Disallow?


killroy

3:41 pm on Apr 11, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Google does not list sites in SafeSearch which employ a robots.txt file.

Does this also affect sites that have a minimal file containing only:
User-agent: *
Disallow:

Am I better off having NO robots.txt and bearing all the "file not founds" in my logs?

I hope to find out before the next crawl, thanks.

SN

pixel_juice

7:34 pm on Apr 12, 2003 (gmt 0)

10+ Year Member



Google does not list sites in SafeSearch which employ a robots.txt file.

This isn't true, Killroy. SafeSearch is a filter for adult content etc. and has nothing to do with robots exclusion, which is a method of preventing (polite) spiders from indexing certain files.
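The distinction can be checked directly. As a sketch, Python's standard urllib.robotparser module (the URL here is just an illustration) shows that the minimal file from the question, with an empty Disallow line, permits every polite spider to fetch everything; SafeSearch never enters into it:

```python
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
# The minimal robots.txt from the question: an empty Disallow allows all paths
rp.parse(["User-agent: *", "Disallow:"])

# Any compliant crawler may fetch any URL on the site
print(rp.can_fetch("Googlebot", "http://www.example.com/any/page.html"))  # True
```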

Am I better off having NO robots.txt and bearing all the "file not founds" in my logs?

"file not founds" in your log are 404 errors when a visitor or spider requests a page that is missing. Robots.txt has nothing to do with this either. The only way to get rid of the 404s is to set up some kind of redirect from the missing pages, or add the missing pages themselves.

[edit reason = spelling]

[edited by: pixel_juice at 7:46 pm (utc) on April 12, 2003]

jimbeetle

7:41 pm on Apr 12, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Killroy, for the second part of your question, if you do not need any entries in robots.txt your best bet is to upload a 'blank' robots.txt file. Just create an empty document in Notepad or similar, name it robots.txt and upload it. The bots will be happy that they found something and you'll be able to concentrate on any valid 404s that arise.
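From a shell, the same blank file can be created and checked before uploading (a sketch; any method that produces a zero-byte file named robots.txt works just as well):

```shell
# Create a zero-byte robots.txt -- equivalent to the blank Notepad file above
: > robots.txt

# Confirm it exists and is empty before uploading to the site root
ls -l robots.txt
```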

Jim