

Link checker reports some exclusion

     

toplisek

8:27 pm on Nov 9, 2011 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



I have seen the following INFO warning on some links:
Status: (N/A) Forbidden by robots.txt
The link was not checked due to robots exclusion rules. Check the link manually.

How is this related to the robots.txt file?
Is this block the issue:
<FilesMatch "\.(js|css|html|htm|php|xml|shtml)$">
SetOutputFilter DEFLATE
</FilesMatch>

lucy24

11:29 pm on Nov 9, 2011 (gmt 0)

WebmasterWorld Senior Member, Top Contributor of All Time



Are you asking about your own files, or external links to other people's files? If you're getting this response when checking external links, you can't do anything about it -- and you don't need to. What you quote is a standard Link Checker response. It's not an error; it means exactly what it says. Humans can go there, but robots can't.

The reference is to robots.txt, not to .htaccess.
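
For example, a record along these lines in robots.txt (the /private/ directory is just made up for illustration, not something from your site) would make the link checker skip every URL under it and report exactly the message you quoted:

User-agent: *
Disallow: /private/

The DEFLATE block you posted only compresses output; it has nothing to do with robots exclusion.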

I remember when I first added a robots.txt file, the W3C link checker simply refused to look at anything in one directory. Somewhere along the line it sorted itself out. I have these lines

User-Agent: W3C-checklink
Disallow:

meaning that it's allowed to go anywhere and everywhere, including places that normal robots aren't allowed to go. In my case, it only ever affected internal # fragment links.
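
Put together, a robots.txt along these lines (again, /private/ is only an example directory) lets the checker in while keeping ordinary robots out:

User-Agent: W3C-checklink
Disallow:

User-agent: *
Disallow: /private/

Each robot follows the record that best matches its own name, so W3C-checklink uses the first record (empty Disallow = allowed everywhere) and ignores the blanket rule meant for everyone else.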
 
