FAST-WebCrawler has visited me twice, but only retrieved one file each time (which I assume was robots.txt).
Checking the correct syntax, I realised I had made a small error, and I wonder if this is the reason it hasn't done a full crawl (Google and AltaVista have crawled OK).
I had written:
Disallow: /~blah
but I should have had:
Disallow: /~blah/ #don't forget the final slash
Did FAST interpret that as Disallow: / (i.e. disallow the whole site)?
Is my interpretation correct? Should I let FAST know that I made a mistake, or will they come back and look without me notifying them?
The only result of your typo is that files in your top-level directory whose names start with "~blah" (e.g. "/~blah2.html") would be disallowed, as well as the subdirectory "/~blah/" that you intended to disallow. If no such files exist, then the typo will have no practical effect.
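You can see the prefix-matching behaviour for yourself with a short script. This is just an illustrative sketch using Python's standard urllib.robotparser; the file names ("/~blah2.html" etc.) are made up for the example.

```python
# Demonstrate how a missing trailing slash in a Disallow rule
# blocks any path that merely *starts with* the given prefix.
from urllib import robotparser

def allowed(rules, path):
    """Parse a robots.txt body and ask if '*' may fetch the path."""
    rp = robotparser.RobotFileParser()
    rp.parse(rules.splitlines())
    return rp.can_fetch("*", path)

typo  = "User-agent: *\nDisallow: /~blah"    # missing trailing slash
fixed = "User-agent: *\nDisallow: /~blah/"   # the intended rule

# The typo blocks every top-level file starting with "~blah"...
print(allowed(typo, "/~blah2.html"))      # False: blocked unintentionally
print(allowed(typo, "/~blah/page.html"))  # False: blocked as intended
# ...while the corrected rule only blocks the subdirectory.
print(allowed(fixed, "/~blah2.html"))     # True: no longer blocked
print(allowed(fixed, "/~blah/page.html")) # False
```

So the typo only over-blocks if such "~blah*" files actually exist, which matches the explanation above.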
Check out the robots.txt validator page [searchengineworld.com] and the info linked on that page for more detail on why you might have a problem. If you don't find any problems, it may just be that FAST found a link to your site while crawling another site, and came over to check robots.txt to see whether it would be allowed to spider your site later.
I also had some problems with FAST crawling my site. They visited regularly over a period of months but never requested anything other than robots.txt. I would recommend you contact them, as I did; they are very helpful, and my site is now being deep crawled and indexed...