---- Every week FAST's crawler fetches robots.txt & leaves without crawling
luma - 1:41 am on Jul 11, 2002 (gmt 0)
Hi jdMorgan, thanks for your answer, and thanks to everyone who has been checking.
Your robots.txt looks OK to me... The only thing I see is that your three Disallows for "Microsoft URL Control" are not likely to work - those user-agents likely won't check robots.txt at all. You should probably block these in .htaccess (for Apache server) instead.
I do use .htaccess for some 301s but haven't figured out how to block UAs.
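For what it's worth, here's a minimal sketch of UA blocking with mod_rewrite, assuming your Apache has it enabled (the user-agent string is the one from your robots.txt Disallows):

```apache
# Sketch only: return 403 Forbidden to any client whose User-Agent
# contains "Microsoft URL Control" (case-insensitive match).
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} "Microsoft URL Control" [NC]
RewriteRule .* - [F]
```

If mod_rewrite isn't available, SetEnvIfNoCase plus a Deny directive can do the same job.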
Does Fast index any of the sites that link to your site? If so, they should pick up your site quickly.
Yes, other pages that link to me are in the index.
In your log files, what server code does your server return when Fast requests robots.txt?
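Here's a quick sketch of how one might pull that out of a combined-format Apache access log (the log line below is made up for illustration; the real check would just grep your own access log):

```shell
# Create a sample combined-format log line for illustration
# (IP, date, and user-agent are made up).
cat > access.log <<'EOF'
66.77.73.236 - - [10/Jul/2002:03:12:45 +0000] "GET /robots.txt HTTP/1.0" 200 124 "-" "FAST-WebCrawler/3.2"
EOF

# In combined format the status code is field 9.
grep 'robots.txt' access.log | awk '{print $9}'
```

A 200 means the crawler got the file; a 404 or 5xx would explain it leaving without crawling.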
Have you recently moved your site?
No. But I only started adding real content and getting links to it a couple of months ago.
Have you tried Fast's "Submit a Site" process?
I am sure I submitted a page or two a couple of months ago (free submit) and did so again a couple of days ago. I might be wrong, because I don't keep notes...
Another "picky" thing about robots.txt is that it must be a Unix-format file;
I am using Linux myself and checked again but everything seems right.
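In case it helps anyone else double-checking, here's a quick sketch for spotting DOS (CRLF) line endings; the two files below are just illustrative stand-ins:

```shell
# Make one CRLF-ended and one LF-ended file to compare
# (the filenames and contents are illustrative).
printf 'User-agent: *\r\nDisallow:\r\n' > robots-dos.txt
printf 'User-agent: *\nDisallow:\n'     > robots-unix.txt

# Count carriage-return bytes: a Unix-format file should print 0.
tr -cd '\r' < robots-dos.txt  | wc -c
tr -cd '\r' < robots-unix.txt | wc -c
```

If the count on the real robots.txt isn't 0, re-saving the file with Unix line endings fixes it.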
So, good question - Anyone else?
Thanks for your help. You see, I think I've double-checked everything and really can't find anything. :(