User-agent: *
Disallow: /images/
--------
The validator states the following:
ERROR There should be at least 1 Disallow line in any robots.txt.
User-agent: *Disallow: /images/
We're sorry, this robots.txt does NOT validate.
Warnings Detected: 1
Errors Detected: 1
1 warning: An empty user-agent field was detected. Each User-agent record should have at least one Disallow line per record. This error may also have been generated by bad line enders.
User-agent: *Disallow: /images/
Line  Code
1     User-agent: * Disallow: /images/
---------
I have tried deleting and re-entering a CR multiple times after the * character, both in Word and in a plain text editor, with the same result, since the message implies there is a bad line ender. Any suggestions?
Thank you.
Firstly, don't use Word; use a text editor like Notepad and recreate the file from scratch. Word can add extra characters which will mess up the file. If the validator still chokes on the new file, open it up in your browser: is the line break in there? Also, if you are using Firefox, you can press Ctrl+I and you should see "Type"; it should be listed as text/plain. If it is something else, like text/html, then you may have a server misconfiguration problem.
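If you'd rather not eyeball the raw file, a short script can report exactly which line enders it contains. This is just a sketch; the sample byte strings below are made up to illustrate a Word-style bare-CR file versus a clean one:

```python
def line_enders(data: bytes) -> dict:
    """Count line-ender styles in raw robots.txt bytes."""
    crlf = data.count(b"\r\n")
    return {
        "crlf": crlf,
        "bare_cr": data.count(b"\r") - crlf,  # lone CRs often trip validators
        "lf": data.count(b"\n") - crlf,
    }

# Hypothetical examples: a file with bare carriage returns vs. a clean one.
bad = b"User-agent: *\rDisallow: /images/\r"
good = b"User-agent: *\nDisallow: /images/\n"

print(line_enders(bad))   # {'crlf': 0, 'bare_cr': 2, 'lf': 0}
print(line_enders(good))  # {'crlf': 0, 'bare_cr': 0, 'lf': 2}
```

If `bare_cr` is non-zero, resave the file from a plain text editor with normal line breaks before re-running the validator.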
robots.txt source code for http://example.com/robots.txt
Line  Code
1     User-agent: *
2
3     Disallow: /images/
Will the robots still understand this?
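One data point rather than a definitive answer: Python's standard-library robots.txt parser happens to treat a blank line as the end of a record, so in that parser the Disallow line gets dropped and the URL stays fetchable. A sketch (the URL is a placeholder):

```python
from urllib.robotparser import RobotFileParser

# Record with a blank line between User-agent and Disallow,
# as in the validator output above.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "",                     # blank line splits the record in this parser
    "Disallow: /images/",
])
print(rp.can_fetch("*", "http://example.com/images/pic.jpg"))  # True: rule lost

# Same record without the blank line.
rp2 = RobotFileParser()
rp2.parse([
    "User-agent: *",
    "Disallow: /images/",
])
print(rp2.can_fetch("*", "http://example.com/images/pic.jpg"))  # False: blocked
```

Other crawlers may be more forgiving, but it's safest not to put blank lines inside a record.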
[edited by: ThomasB at 2:28 pm (utc) on Jan. 6, 2006]
Maybe the two validators have different algorithms, but this works for me.
Previously I had the same problem of Googlebot indexing banned directories. I fixed the robots.txt file, uploaded the new version, and that was it! All the restricted URLs now show up in my error statistics in Sitemaps.