Msg#: 4385347 posted 11:29 pm on Nov 9, 2011 (gmt 0)
Are you asking about your own files, or about external links to other people's files? If you're getting this response when checking external links, you can't do anything about it, and you don't need to. What you quote is a standard Link Checker response. It's not an error; it means exactly what it says: humans can go there, but robots can't.
The reference is to robots.txt, not to .htaccess.
I remember when I first added a robots.txt file, the W3C link checker simply refused to look at anything in one directory. Somewhere along the line it sorted itself out. I now have these lines:

User-Agent: W3C-checklink
Disallow:
meaning that it's allowed to go anywhere and everywhere, including places that normal robots aren't allowed to go. In my case, it only ever affected internal # fragment links.
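If you want to check how a rule like that is interpreted without waiting on the link checker, Python's standard library can parse robots.txt for you. Here's a small sketch (the robots.txt content and bot names are made up for illustration) showing that an empty Disallow for W3C-checklink lets it in everywhere, even where other robots are blocked:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: checklink may go anywhere (empty Disallow),
# all other robots are kept out of /private/.
robots_txt = """\
User-Agent: W3C-checklink
Disallow:

User-Agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("W3C-checklink", "/private/page.html"))  # True
print(rp.can_fetch("SomeOtherBot", "/private/page.html"))   # False
print(rp.can_fetch("SomeOtherBot", "/public/page.html"))    # True
```

Note that an empty Disallow means "disallow nothing", i.e. allow everything; it is not the same as `Disallow: /`, which blocks the whole site.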