
Forum Moderators: goodroi

save and upload robots.txt

   
9:41 am on Feb 2, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Hi,
I have the problem that Inktomi only ever reads my robots.txt file and nothing else.
So I checked my robots.txt and made some changes, just to see whether anything would change.
My robots.txt validates.

I'm also wondering how the file should be saved in TextPad. I've read it should be saved in Unix mode, but should the encoding be ANSI, DOS, UTF-8, Unicode, or Unicode (big-endian)? It should then be uploaded in ASCII mode.

I don't know if I did this correctly. I do know that the file is now larger (in bytes) in my logs than before, and larger than the copy on my PC.
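For what it's worth, the requirements above boil down to: plain ASCII bytes, LF-only (Unix) line endings, and no byte-order mark. A minimal sketch (in Python, with a hypothetical `check_robots_file` helper) that checks a local copy before uploading:

```python
def check_robots_file(path):
    """Return a list of problems found in a local robots.txt file.

    Checks three things a crawler-friendly robots.txt should satisfy:
    plain ASCII, LF-only line endings, and no UTF-8 byte-order mark.
    """
    with open(path, "rb") as f:
        data = f.read()

    problems = []
    try:
        data.decode("ascii")
    except UnicodeDecodeError:
        problems.append("file contains non-ASCII bytes (wrong encoding?)")
    if b"\r" in data:
        problems.append("file contains CR bytes (DOS/Mac line endings)")
    if data.startswith(b"\xef\xbb\xbf"):
        problems.append("file starts with a UTF-8 byte-order mark")
    return problems
```

An empty list means the file is safe to upload in ASCII mode; a DOS-saved file would be flagged because every line ends in CR+LF, which also explains a byte count larger than the Unix-saved original.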

11:19 am on Feb 2, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Feel free to sticky me the URL and I'll have a look at it.
7:41 pm on Feb 2, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Hi,
I sent you the sticky; did you receive it?
8:27 pm on Feb 2, 2004 (gmt 0)

10+ Year Member



Taken from the Inktomi web page:

"There is an Inktomi-specific extension to robots.txt which allows you to set a lower limit on our crawler request rate. You can add a "Crawl-delay: xx" instruction, where "xx" is the minimum delay in seconds between successive crawler accesses. Our default crawl-delay value is 1 second. If the crawler rate is a problem for your server, you can set the delay up to 60 or 300 or whatever value is comfortable for your server. Setting a crawl-delay of 20 seconds for Slurp would look something like:
User-agent: Slurp
Crawl-delay: 20"
-------------------------- end ------------------------
You could use that directive inside your robots.txt file, but then that robots.txt validator page would flag your file as invalid. So which will it be?
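As a side note, Python's standard `urllib.robotparser` module does understand the Crawl-delay extension, so you can sanity-check how a parser reads those two lines. A minimal sketch (the agent name Slurp is taken from the quote above):

```python
from urllib.robotparser import RobotFileParser

# Parse the two-line Crawl-delay example from the Inktomi page,
# plus an empty Disallow so the record is complete.
rp = RobotFileParser()
rp.parse([
    "User-agent: Slurp",
    "Crawl-delay: 20",
    "Disallow:",
])

print(rp.crawl_delay("Slurp"))  # → 20
```

An agent with no matching record (and no `User-agent: *` fallback) gets `None` back, so the delay only applies to Slurp here.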

 
