
Is robots.txt always needed?

2:17 am on Oct 3, 2004 (gmt 0)

10+ Year Member



Should I have a robots.txt file of some description, even though I do not wish to exclude any files?

Is the simplest option just:

User-agent: *
Disallow:

I would just like to use something like "index" or "archive all pages", but I can find no valid positive values for this file; all the standards I have read deal only with exclusion.

I simply wish to return a robots.txt telling all bots, especially Googlebot, that they are welcome to spider my entire site, in preference to returning my custom 404 page.

[edited by: Woz at 3:06 am (utc) on Oct. 3, 2004]
[edit reason] No URLs please, see TOS#13 [/edit]

3:07 am on Oct 3, 2004 (gmt 0)

10+ Year Member



From what you say, you don't need a robots.txt. I would advise against having one, just in case you make a mistake in it. That said, if you really want one, you could have a blank robots.txt file or one with just this:

User-agent: *
Disallow:

While you say you don't want to exclude anything, it is a good idea to exclude the load of bots that will come along and cause pain. You can use this site's robots.txt as a starting point: delete any bots you do want to let crawl, and delete the end section, which is site-specific.
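
For example, a rough sketch along these lines shuts out the bots you name while leaving the site open to everyone else. The user-agent names here are only placeholders, not a vetted blocklist; check your own logs for the crawlers you actually want to keep out:

# Placeholder names for unwanted crawlers
User-agent: BadBotExample
Disallow: /

User-agent: AnotherBotExample
Disallow: /

# Everyone else may crawl everything
User-agent: *
Disallow: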

You are right, there are no positive values such as "Allow". Here are two ways to get around it.

To exclude all files except one

The easy way is to put all files to be disallowed into a separate directory, say "docs", and leave the one file in the level above this directory:
User-agent: *
Disallow: /~joe/docs/

Alternatively you can explicitly disallow all disallowed pages:
User-agent: *
Disallow: /~joe/private.html
Disallow: /~joe/foo.html
Disallow: /~joe/bar.html
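
A third option, if Googlebot is the main concern: Google's crawler understands a non-standard "Allow" line, so you can disallow a whole directory and then allow a single file back in. Crawlers that only follow the original exclusion standard may ignore the Allow line, so treat this as a Google-specific sketch (the file name is just an example):

User-agent: Googlebot
Disallow: /~joe/docs/
Allow: /~joe/docs/public.html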

4:41 am on Oct 3, 2004 (gmt 0)

10+ Year Member



"I simply wish to return a robots.txt to tell all bots especially googebot that they are welcome to spider my entire site, in preference to returning my custom 404 page."
I added the minimal robots.txt file to my site just to avoid all the 404 messages in my error log. It serves no useful purpose.

User-agent: *
Disallow:

12:24 am on Oct 4, 2004 (gmt 0)

10+ Year Member



I see on Google:

if you wanted to allow all filetypes to be served.... the robots.txt ... would be:

User-Agent: *
Allow: /

[google.com...]

5:59 am on Oct 4, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



if you wanted to allow all filetypes to be served

that stops all crawlers from accessing your site

6:39 am on Oct 4, 2004 (gmt 0)

10+ Year Member



User-agent: *
Disallow:

it is!

6:54 am on Oct 4, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Correct, I should have opened my eyes first ;)

7:37 am on Oct 4, 2004 (gmt 0)

10+ Year Member



I'm with fourstardragon: it is worth having a robots.txt file purely to cut down on 404s when checking logs.

12:23 pm on Oct 4, 2004 (gmt 0)

leosghost, WebmasterWorld Senior Member 10+ Year Member



There was a time when search engines that couldn't find a robots.txt wouldn't spider. Last year ATW had a problem with this; it lasted about six weeks. I had one site up without a robots.txt (not deliberately, I just forgot to write the *^$)* thing, and a sort of blindness every time I looked meant I never noticed it was missing). ATW came at least twice a day, requested it (yes, eventually I looked at my logs!), couldn't find it, and went away empty-handed, so to speak...

In case such things happen elsewhere... better to have than to have not.

12:38 pm on Oct 4, 2004 (gmt 0)

10+ Year Member



Leosghost, you forgot to include your robots.txt. What do you use when you wish to have all files spidered?
 
