Forum Moderators: goodroi


Why doesn't this validate?

Help please


Vimes

11:29 am on Feb 10, 2006 (gmt 0)

10+ Year Member



Hi,

Thought I understood this robots.txt thing.

When I put this through the validator, it flags errors on the Crawl-delay lines. Can anyone help, please?

User-agent:*
Disallow:

User-agent: MSNBot
Disallow:
Crawl-delay: 10

User-agent: Teoma
Disallow:
Crawl-delay: 10

Vimes.

[edit] Sorry, the title should say "doesn't this validate" [/edit]

victor

11:58 am on Feb 10, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Crawl-delay is recognised by many spiders, and recommended by some.

But it is not part of the robots exclusion standard.

It probably does no harm with the spiders that don't recognise it -- like googlebot -- since they simply ignore the line.
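For what it's worth, some parsers do understand Crawl-delay even though it's outside the standard. A quick sketch with Python's stdlib urllib.robotparser (using a trimmed copy of the file from the first post; the rp.modified() call just marks the parser as having fresh data so crawl_delay() works on locally supplied text):

```python
from urllib.robotparser import RobotFileParser

# Trimmed copy of the robots.txt from the first post.
robots_txt = """\
User-agent: *
Disallow:

User-agent: MSNBot
Disallow:
Crawl-delay: 10
"""

rp = RobotFileParser()
rp.modified()  # mark as fetched, since we parse text instead of reading a URL
rp.parse(robots_txt.splitlines())

print(rp.crawl_delay("MSNBot"))        # -> 10
print(rp.crawl_delay("SomeOtherBot"))  # -> None ("*" record has no Crawl-delay)
print(rp.can_fetch("MSNBot", "/"))     # -> True (empty Disallow allows everything)
```

So the directive parses fine here; the validator complaint is only that Crawl-delay isn't in the original exclusion standard.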

jdMorgan

4:40 am on Feb 12, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



What *could* cause other unexpected problems is that your first record applies to the user-agent "*" -- meaning all robots. Some robots will read that record and go no further, since they will act on either their own user-agent name or "*", whichever they find first. I suspect that msnbot and googlebot are smarter than that, but for best results, the "*" record should be the last one.
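In other words, the same file reordered (same directives as in the first post, just with the "*" record moved to the end):

User-agent: MSNBot
Disallow:
Crawl-delay: 10

User-agent: Teoma
Disallow:
Crawl-delay: 10

User-agent: *
Disallow: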

Jim

Vimes

11:04 am on Feb 14, 2006 (gmt 0)

10+ Year Member



Thank you Victor and JD,

Much Appreciated

Vimes.