Forum Moderators: phranque


Apache Completely 403/deny blank useragents


JAB Creations

7:59 pm on Sep 21, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I know I've asked before how to send a custom 403 page for blank useragents, but EVERY SINGLE BLANK UA IS SPAM LISTED! I do not want them to be able to access anything for any reason. In the previous post where I got help from jd, Apache would take a little while before it would actually 403 requests, and I don't know why. Is a referred request not a direct request, and are only direct requests denied? If you completely turn off your UA while still downloading images from, say, some HTML file, I want to be able to deny that immediately. I know what I want to do; I'm just not sure how it is technically specified. I've been trying code out on a local install of Apache/2.0.54, but I keep screwing up my file.

So in short I want to NEVER satisfy ANY request from a blank useragent.
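[For reference, the usual mod_rewrite pattern for this goal looks like the following. This is a minimal sketch, assuming the rules live in a .htaccess file and mod_rewrite is available; it is an illustration of the technique, not the actual code from the earlier thread.]

```apache
# Sketch: deny every request whose User-Agent header is empty or missing.
Options +FollowSymLinks
RewriteEngine on

# %{HTTP_USER_AGENT} expands to the empty string when no UA header is sent
RewriteCond %{HTTP_USER_AGENT} ^$
# [F] returns 403 Forbidden for every matching request (pages, images, etc.)
RewriteRule .* - [F]
```

[An alternative that avoids mod_rewrite entirely is `SetEnvIf User-Agent ^$ blank_ua` combined with `Deny from env=blank_ua` under `Order Allow,Deny`.]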

bcolflesh

8:06 pm on Sep 21, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



[webmasterworld.com...]

jdMorgan's final response seems to be what you want - isn't it working for you?

JAB Creations

7:46 am on Sep 23, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



No, it is not. I'm testing with Apache/2.0.54 and will then be using the same code for Apache/1.3.33. All I have been able to do with the code in place is block directory listings regardless of the UA; removing it restores normal operation.

JAB Creations

8:28 am on Sep 23, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I've got it half working, and while minimizing the code on a local Apache install I keep getting this error...

[Fri Sep 23 04:22:53 2005] [error] [client 127.0.0.1] Options FollowSymLinks or SymLinksIfOwnerMatch is off which implies that RewriteRule directive is forbidden: C:/../htdocs/error/error-403.html

Now the problem is that while a direct request (the Go button) is sent a 403, if you click links from page to page you can still browse normally... which is what I'm concerned about. I don't want ANY requests (direct or not) to be allowed with a blank UA.

I am seeing 304s in the logs... so I am assuming that cached copies of files are somehow allowing access to them? (If that is the case, then how would I block caching, and which files would I have to set to not cache - or, in general, a directory, but without it applying to the files inside?)

I'm a bit confused here...
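[If caching really were the gap, mod_headers could mark responses non-cacheable. A hedged sketch, assuming mod_headers is loaded - though as the next reply explains, the 304s here are more likely the tester's own browser cache:]

```apache
# Sketch only: assumes mod_headers is loaded. "no-store" tells browsers
# and proxies not to keep a copy, so every visit is a fresh request.
<IfModule mod_headers.c>
Header set Cache-Control "no-store"
</IfModule>
```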

JAB Creations

8:51 am on Sep 23, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Perhaps these spammers have a setup where they are able to trick Apache into thinking the IP with the blank UA requesting the file already has it in the cache, which allows the page to load... and therefore they are able to follow links as they desire? That is how I would approach it if I were a scumbag. ~IF~ that is the case, would I be able to disable cache requests from blank useragents, and would that close the gap?

I keep updating this post (now I have to post a reply, eh)... though I think I've pinpointed it as well as I possibly can.

jdMorgan

10:35 pm on Sep 24, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



> Options FollowSymLinks or SymLinksIfOwnerMatch is off which implies that RewriteRule directive is forbidden:

That says it all, really.

Add Options +FollowSymLinks ahead of RewriteEngine on in your code.

You seem to be having an inordinate amount of trouble getting this usually-simple function to work. If you are seeing 403 responses in your logs, then your server is blocking access.

By way of explanation, I should probably also mention that in order for you to actually 'see' what your server's response is, you'll need to flush your own browser cache after making *any* change(s) to your access-control code. Once you've got a copy of a 'denied' page in your cache, your browser will no longer send subsequent requests for that page to your server, so results will look incorrect/strange. So either flush your cache often, or disable it while testing.
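[Putting Jim's fix in context, the rule block would start like this. A sketch only: the /error/error-403.html path is taken from the error-log entry quoted above, and the exemption line is an assumption about why the custom 403 page itself was triggering the rule.]

```apache
# Enable FollowSymLinks first, so per-directory mod_rewrite is permitted
Options +FollowSymLinks
RewriteEngine on

# Let the custom 403 page itself through, or the ErrorDocument request
# would be denied too (path taken from the error-log entry above)
RewriteCond %{REQUEST_URI} !^/error/error-403\.html$
# Deny anything with an empty or missing User-Agent
RewriteCond %{HTTP_USER_AGENT} ^$
RewriteRule .* - [F]

ErrorDocument 403 /error/error-403.html
```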

Jim