New Apache response, 500, to the [F] flag in mod_rewrite?
Web host says nothing wrong in their server configuration
thord
msg:3546488 - 11:15 am on Jan 12, 2008 (gmt 0)

My web host (shared server) recently upgraded their hardware and Apache (now v. 2.2.3, formerly 2.0.47).
In order to deny access from certain referrers I have this line in my .htaccess (there are other rules too):

RewriteCond %{HTTP_REFERER} (example\.com) [NC]
RewriteRule .* - [F]

Previously this resulted in a 403 Forbidden status, and everything else was fine too. Now it results in a 500 Internal Server Error, which I do not like. When I contacted the host, they replied that their server configuration is correct and that maybe their new Apache version responds to the [F] flag that way.

Can someone please tell me if my mod_rewrite rule is somehow wrong or deprecated? Could some other rule, or a missing rule, in my (unchanged) .htaccess cause this 500 for denied URLs? If not, what precisely should I ask my host to check in their configuration, and where?

 

jdMorgan
msg:3546679 - 5:18 pm on Jan 12, 2008 (gmt 0)

Do you use a custom 403-Forbidden page?
Have you checked your server error log?

If you are using a custom 403 error page, you must allow even denied user-agents, remote hosts, or referrers to access that custom error page; otherwise you'll get a cascade of 403 errors as the server tries to send that error page to a denied client. You must provide an exception to your rule so that the 403 error page can be served in all cases.
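
For illustration, here is a minimal sketch of the kind of .htaccess combination that produces that cascade (the filename /custom-403.html is only a placeholder, not your actual page):

ErrorDocument 403 /custom-403.html

RewriteEngine On
RewriteCond %{HTTP_REFERER} example\.com [NC]
RewriteRule .* - [F]
# The internal request for /custom-403.html also matches ".*" and is
# forbidden in turn, so Apache cannot serve the error page and may
# report a 500 instead of the expected 403.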

If this is the case, you should see clear evidence of this problem in your server's error log file.

Just a guess...

I also suggest adding an exception for robots.txt, since some robots will treat any error fetching robots.txt as carte blanche to spider your entire site.
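
One way to do that (a sketch only; both filenames below are placeholders) is to fold robots.txt into the same negative-match exception that covers the error page:

RewriteCond %{HTTP_REFERER} example\.com [NC]
RewriteRule !^(robots\.txt|custom-403\.html)$ - [F]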

Jim

thord
msg:3546717 - 6:37 pm on Jan 12, 2008 (gmt 0)

Yes, I can only admit that you are right. I do not have access to the error logs, but I did have a custom 403 page, so it is all clear now. Evidently the previous Apache version somehow tolerated my error and still returned a 403, because this setup worked fine for years, until the web host upgraded.

I think it is better to keep things simple, so I just removed the "ErrorDocument 403" line from the .htaccess. Only 404 and 500 are left. Hopefully this is a correct and sufficient solution. My test shows that this referrer now gets what it deserves: a 403.
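
For reference, the error-document lines that remain would look something like this (the target paths are placeholders, not the actual filenames):

ErrorDocument 404 /404.html
ErrorDocument 500 /500.html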

Thank you.

jdMorgan
msg:3546721 - 6:46 pm on Jan 12, 2008 (gmt 0)

Should you decide you'd like to use the custom 403 error page again, adding the exclusion is a simple matter of using a negative-match pattern:

RewriteCond %{HTTP_REFERER} example\.com [NC]
RewriteRule !^custom-403-error-page\.html$ - [F]
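
The leading "!" inverts the pattern, so the [F] applies to every requested URL except the error page itself. If you do bring the custom page back, the matching "ErrorDocument 403" directive pointing at that same file would of course need to be restored as well.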

Jim
