Forum Moderators: goodroi
That's weird... It should work fine. What validator are you using?
How about:
User-agent: Googlebot
Disallow: /some_directory_that_doesn't_exist

User-agent: *
Disallow: /
That might be a work-around.
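If it helps, here's a quick way to sanity-check how that work-around is interpreted, using Python's standard urllib.robotparser. The directory name is the one from above, and "/some_page" is just a placeholder path:

```python
# Sketch: feed the suggested robots.txt into Python's stdlib parser
# and check what each user-agent is allowed to fetch.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: Googlebot
Disallow: /some_directory_that_doesn't_exist

User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Googlebot matches the first record, whose only Disallow points at a
# directory that doesn't exist, so real pages stay fetchable.
print(rp.can_fetch("Googlebot", "/some_page"))      # True
# Any other bot falls through to the catch-all record and is blocked.
print(rp.can_fetch("SomeOtherBot", "/some_page"))   # False
```

Of course this only shows how one parser reads the file; individual crawlers may differ.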
Two points, just in case: You must have a blank line between records (After a
Disallow line and before the next User-agent line). And you need to write your
robots.txt file using an editor that will not put carriage-return/linefeed pairs
at the end of each line. robots.txt is a Unix file, and it only wants to see a
linefeed. I use Microsoft Word in ASCII mode with "Linefeed Only" selected
when I have to work on a robots.txt file on a Windows system. These options
are available in Word's "Save As" dialog box. The Notepad editor will put
CR/LF pairs in the file, and it won't validate.
Jim
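For what it's worth, if you'd rather generate the file programmatically than fight your editor, here's a sketch in Python that writes LF-only line endings even on Windows (the filename and rules are just placeholders):

```python
# Sketch: write robots.txt with LF-only line endings. Passing
# newline="\n" to open() stops Python from translating "\n" into
# "\r\n" on Windows.
import os
import tempfile

rules = (
    "User-agent: *\n"
    "Disallow: /\n"
)

path = os.path.join(tempfile.mkdtemp(), "robots.txt")
with open(path, "w", newline="\n") as f:
    f.write(rules)

# Verify: read the raw bytes and confirm there are no CR characters.
with open(path, "rb") as f:
    data = f.read()
print(b"\r" in data)  # False
```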
I found the error. I had something like this on top of the file:
User-agent: *
Disallow: /trap/
Now I think I have to delete that record and instead replace every empty
Disallow:
line with
Disallow: /trap/
Not pretty if you ask me, but...
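That duplication really is how the protocol behaves: a crawler uses only the record matching its own User-agent, so a rule in the catch-all record doesn't apply to a bot that has its own record. A quick sketch with Python's urllib.robotparser ("ExampleBot" and the paths are made up):

```python
# Sketch: a bot with its own record ignores the catch-all record
# entirely, which is why /trap/ has to be repeated per record.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: ExampleBot
Disallow: /private/

User-agent: *
Disallow: /trap/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# ExampleBot matches its own record; the /trap/ rule in the
# "User-agent: *" record does not apply to it.
print(rp.can_fetch("ExampleBot", "/trap/page"))  # True
```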