Sitemaps, Meta Data, and robots.txt Forum
Link checker reports some exclusion
toplisek
msg:4385349
8:27 pm on Nov 9, 2011 (gmt 0)

I have seen the following INFO warning on some links:

Status: (N/A) Forbidden by robots.txt
The link was not checked due to robots exclusion rules. Check the link manually.

How is this related to the robots.txt file? Is this the cause:
<FilesMatch "\.(js|css|html|htm|php|xml|shtml)$">
SetOutputFilter DEFLATE
</FilesMatch>

 

lucy24
msg:4385390
11:29 pm on Nov 9, 2011 (gmt 0)

Are you asking about your own files, or about external links to other people's files? If you're getting this response when checking external links, you can't do anything about it, and you don't need to. What you quoted is a standard Link Checker response. It's not an error; it means exactly what it says: humans can go there, but robots can't.

The reference is to robots.txt, not to .htaccess. Your FilesMatch block just turns on mod_deflate output compression; it has nothing to do with robots exclusion.
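For illustration, a rule like this on the target site (the path is hypothetical) is the sort of thing that triggers that status, since link checkers identify themselves as robots and honor the exclusion:

# Hypothetical robots.txt on the linked-to site: every URL under
# /private/ is off-limits to all robots, so a link checker skips
# those links and reports "Forbidden by robots.txt".
User-agent: *
Disallow: /private/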

I remember that when I first added a robots.txt file, the W3C link checker simply refused to look at anything in one directory. Somewhere along the line it sorted itself out. My robots.txt has these lines:

User-Agent: W3C-checklink
Disallow:

meaning that it's allowed to go anywhere and everywhere, including places that normal robots aren't allowed to go. In my case, it only ever affected internal # fragment links.
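To sketch how that exception combines with an ordinary rule (the directory name here is hypothetical): a robot obeys only the group whose User-agent line best matches its name, falling back to *, so the checker gets the empty Disallow while everyone else is kept out of /drafts/.

# Hypothetical combined robots.txt:
# the W3C link checker may crawl everything...
User-agent: W3C-checklink
Disallow:

# ...while all other robots are excluded from /drafts/.
User-agent: *
Disallow: /drafts/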
