[Sun Feb 9 09:07:28 2003] [warn] Apache does not support line-end comments.
I am assuming this warning occurs because the spider trap writes lines to .htaccess like this:
print HTACCESS ("SetEnvIf Remote_Addr \^$remote_addr\$ getout \# $date $remote_agent\n");
So rather than taking out the helpful info at the end, can I make Apache stop spitting out those errors?
On this server I do NOT have access to httpd.conf
I suppose I could put a \n before the # to push the comment onto its own line, but that seems like a waste,
since Apache is obeying the directive anyway
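For what it's worth, if the script were changed to emit the comment on its own line, the trap would write something like the following to .htaccess (the IP, date, and agent here are made-up examples, and the dots are backslash-escaped since SetEnvIf takes a regex). Full-line comments are fine, so the warning should go away:

```
SetEnvIf Remote_Addr ^184\.108\.40\.206$ getout
# Sun Feb 9 2003 BadBot/1.0
```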
I have almost exactly the same thing on my server, and do not have this problem (?)
print HTACCESS ("SetEnvIf Remote_Addr \^$regaddr\$ getout \# $date $usragnt\n");
[added] Maybe my hosting company has the error logs set so that warnings are not shown...
This one does, and I can't see or edit httpd.conf unfortunately.
I think it's a change in Apache between versions 1.3.20 and 1.3.22
I found this in the 1.3.22 changelog
The main new features in 1.3.22 (compared to 1.3.20) are:
The server will now display a warning if line-end comments (#) are found in the configuration file.
Not all directives are able to handle comments on the same line
But there's no mention of a way to disable it, or of whether such a directive
would even work in .htaccess rather than httpd.conf
I suspect that your server which is behaving differently has its LogLevel set to warn, rather than error. LogLevel is part of the Apache core, but is accessible only in server and virtual host config. :(
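For anyone who does have access, the relevant setting looks like this in httpd.conf (server or `<VirtualHost>` context only; it is not honored in .htaccess):

```
# Default is warn; error (or crit) suppresses these warning entries
LogLevel error
```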
...Pretty strange that it issues a warning even though the script and the directives that the script writes demonstrably work fine... I've got too many dead spiders lying around here to believe that this system isn't working. :)
I guess you'll have to put up with the error entries, or add the \n as you stated.
I bet it's at LogLevel warn, which is the default, when I'd rather have it at LogLevel crit
Right now someone is pounding that site, probably from an old URL list;
they violated robots.txt and got banned, but the error log file is now
growing radically in size (hundreds of megs)!
I'll have to question the admin and get them to knock LogLevel up a notch.