| 1:40 am on Jun 15, 2010 (gmt 0)|
I assume that what you posted is the HTTP request Accept header.
What Content-Type header does one of the pages you're trying to validate return?
Use the Live HTTP Headers add-on (or similar) for Firefox/Mozilla to check.
| 6:58 am on Jun 15, 2010 (gmt 0)|
I don't think he'll be able to see it that way: the 406 is returned to the W3C link checker and then shown only as an error message on the link checker's results page.
Directly accessing the page with a browser might yield a different result.
Does the page in question need cookies to operate?
| 1:07 pm on Jun 15, 2010 (gmt 0)|
Have a look at all the "accept" headers and then watch the response headers as suggested. I have seen this happen with an improper Content Negotiation [httpd.apache.org] configuration in the Apache conf files, particularly language and variant directives.
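For reference, a negotiation setup like the following (hypothetical directives, not taken from the poster's server) is the kind of thing that can make Apache return 406 Not Acceptable when the client's Accept-Language matches no available variant:

```apache
# Hypothetical example: with MultiViews on and language variants
# defined, a request whose Accept-Language matches no variant can
# draw a 406 Not Acceptable instead of a page.
Options +MultiViews
AddLanguage en .en
AddLanguage de .de
LanguagePriority en de
# "Prefer Fallback" tells Apache to serve the LanguagePriority choice
# rather than 406 when negotiation finds no acceptable variant.
ForceLanguagePriority Prefer Fallback
```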
| 6:39 pm on Jun 15, 2010 (gmt 0)|
|What Content-Type header does one of the pages you're trying to validate return? |
Right, the Accept header above was from Live HTTP Headers. The Content-Type returned is text/html.
|Does the page in question need cookies to operate |
No, I'd never do that. :-) Most of this site is PHP (version 5.2.9, default configuration as far as I can tell), but I've tested static pages too, same deal: the W3C link checker gets a 406 for anything on this domain. I also ran LinkChecker 5.2 from a Windows machine, and it returned 200 OKs for all good pages.
An interesting side note, I'm able to validate pages with the W3 validator on this site. Just not link check them.
I have .htaccess directives, but all of them are just mod_rewrites for specific pages and conditions. There are no deny directives . . . at all.
The robots.txt file is something I checked today (after a "duh" moment) but it's only disallowing some items:
(I didn't create this, if anything's amiss sound off)
It's also the same site that's producing core dumps on resizing of large files with Imagick [webmasterworld.com].
| 6:48 pm on Jun 15, 2010 (gmt 0)|
Is Mod_Security enabled on this hosting?
| 12:44 am on Jun 16, 2010 (gmt 0)|
Having no other access, I tested with:
RewriteRule ^test.html$ enabled.html [L]
# other rules here
Requesting test.html gives me the not-found page, so I'm going to say no, unless there's a fault in my thinking (and I verified twice that enabled.html was present).
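If that rule was wrapped in a conditional block (not shown in the post), the usual form of this kind of test looks something like the sketch below. This is a reconstruction under that assumption, not the poster's actual file:

```apache
# Hypothetical reconstruction: the rewrite only fires if the
# mod_security module is loaded into this Apache instance.
<IfModule mod_security.c>
RewriteEngine On
RewriteRule ^test\.html$ enabled.html [L]
</IfModule>
```

Note this only detects whether the module is compiled/loaded, not whether any particular rule set is active, so it isn't conclusive on its own.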
| 5:14 pm on Jun 16, 2010 (gmt 0)|
Can you show us all the accept headers? Also, in Live HTTP Headers go to the "Generator" tab and make sure "Invalid" is checked.
| 8:28 pm on Jun 16, 2010 (gmt 0)|
Mod Security, a web application firewall, will return a 406 error by default. There could be a word or URL parameter that is triggering the filter. Certain mod security rule sets can be overly aggressive. I would contact your host to see if they are running mod security and send them the URL that triggers the error.
If they will not exempt you and you cannot change the URL, you may need to consider a move to a VPS so you can control your hosting environment more carefully.
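If the host runs mod_security 1.x and permits per-directory overrides, the filter engine can sometimes be switched off from .htaccess. A sketch only: mod_security 2.x ignores .htaccess, and many hosts disallow these directives entirely:

```apache
# mod_security 1.x only; wrapped in IfModule so a server without
# the module doesn't throw a 500 on the unrecognized directives.
<IfModule mod_security.c>
SecFilterEngine Off
SecFilterScanPOST Off
</IfModule>
```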
| 6:52 pm on Jun 17, 2010 (gmt 0)|
|Mod Security ....you may need to consider a move to a VPS... |
Thank you, jeff, but 1) isn't the previous .htaccess post an accurate test for mod_security? And 2) as mentioned, I have only "limited access" as a third party to this client. It's one of those situations requiring diplomacy: if I contact the host, the client gets the response and says to the second party, "what's wrong with my site . . . " So I need to be very sure before this goes to the next level.
|Can you show us all the accept headers? |
With "Invalid" checked. I've edited only the domain and the cookie line.
GET /test.html HTTP/1.1
User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3
Cookie: [removed, from a different site open in another tab]
HTTP/1.1 200 OK
Date: Thu, 17 Jun 2010 18:42:41 GMT
Server: Apache/2.2.11 (Unix) mod_ssl/2.2.11 OpenSSL/0.9.8e-fips-rhel5 mod_auth_passthrough/2.1 mod_bwlimited/1.4 FrontPage/5.0.2.2635
Keep-Alive: timeout=5, max=100
Though it's likely irrelevant, here are the contents of "test.html", the same file used in the .htaccess test above.
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
<meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1">
<p>this means mod security is NOT enabled</p>
I'd bug the folks at the W3, but I imagine they would shrug and say "your client's server is serving a 406, donno what ta' tell ya' . . . "
| 3:59 pm on Jun 22, 2010 (gmt 0)|
Just wondering if the previous info shed any light, and if my .htaccess test is not a good indicator of the presence of mod_security.
| 5:18 pm on Jun 22, 2010 (gmt 0)|
Dunno. Nothing stands out. Have you asked your host or admin about your firewall and/or security 'filters'?
Otherwise, I think the choices are to either use a different validator, or to write a script to log all W3C request headers and install it on your server to help figure out why they're being blocked. I'm fond of client-side validators, as available in several free raw-HTML editors.
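A minimal version of that logging script, in PHP since the site already runs it. The file names here are made up for illustration:

```php
<?php
// Format one request as a raw-header block for the log.
function format_request_line($method, $uri, array $headers) {
    $out = $method . ' ' . $uri . " HTTP/1.1\n";
    foreach ($headers as $name => $value) {
        $out .= $name . ': ' . $value . "\n";
    }
    return $out . "\n";
}

// When served by Apache, append the real request headers to a log
// file, then emit a trivial page for the link checker to fetch.
if (PHP_SAPI !== 'cli') {
    $headers = function_exists('getallheaders') ? getallheaders() : array();
    file_put_contents(
        dirname(__FILE__) . '/w3c-headers.log',
        format_request_line($_SERVER['REQUEST_METHOD'], $_SERVER['REQUEST_URI'], $headers),
        FILE_APPEND
    );
    header('Content-Type: text/html');
    echo '<p>logged</p>';
}
```

Point the link checker at this page, then diff the logged Accept/Accept-Language headers against a normal browser request to see what the server objects to.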
| 1:19 pm on Jun 23, 2010 (gmt 0)|
Nothing standing out in my mind either. Try dropping an .htaccess directive in to turn off MultiViews, if it's on, then hit the page from the link checker again. I'm just curious whether that has anything to do with it.
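Turning MultiViews off from .htaccess is a one-liner, assuming the host allows Options overrides:

```apache
# Disable filename-pattern content negotiation for this directory tree.
Options -MultiViews
```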
Also, what JD said, drop a script in place to read and log the request from W3 as well as the Apache response headers.