Forum Moderators: phranque
[Wed Sep 26 14:21:31 2007] [error] script '/public_html/adxmlrpc.php' not found or unable to stat
[Wed Sep 26 14:21:32 2007] [error] File does not exist: /public_html/adserver
[Wed Sep 26 14:21:32 2007] [error] File does not exist: /public_html/phpAdsNew
[Wed Sep 26 14:21:32 2007] [error] File does not exist: /public_html/phpadsnew
[Wed Sep 26 14:21:33 2007] [error] File does not exist: /public_html/phpads
[Wed Sep 26 14:21:33 2007] [error] File does not exist: /public_html/Ads
[Wed Sep 26 14:21:34 2007] [error] File does not exist: /public_html/ads
[Wed Sep 26 14:21:34 2007] [error] script '/public_html/xmlrpc.php' not found or unable to stat
[Wed Sep 26 14:21:34 2007] [error] File does not exist: /public_html/xmlrpc
[Wed Sep 26 14:21:35 2007] [error] File does not exist: /public_html/xmlsrv
[Wed Sep 26 14:21:35 2007] [error] File does not exist: /public_html/blog
[Wed Sep 26 14:21:36 2007] [error] File does not exist: /public_html/drupal
The bots are probably probing for common exploits, so the files and folders they request are usually the same. I'm wondering whether I can somehow add a list to .htaccess so that the server totally ignores a request for any file or folder on the list. By "ignore" I mean return no response at all, not even a status code.
If this can be done and someone can provide an example, or a link to an example, it would be much appreciated.
Thanks!
For example, you can set up a subdirectory called something like "/pests" and put a tiny (or empty) custom 403 and/or 404 error page in it, along with a .htaccess file to declare that custom error page. Then, when you get a request for one of these bogus URLs, use the main .htaccess file to rewrite it to that subdirectory.
In this way, any request for the bogus pages returns your tiny custom error page instead of a larger error response containing your 'real' custom error page.
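A minimal sketch of that setup, assuming mod_rewrite is available in .htaccess context (the "/pests" directory name and the "blocked" and "tiny404.html" filenames here are illustrative, not prescribed; the path list comes from the error log above):

```apache
# Main /.htaccess -- sketch only; adjust the pattern to your own logs.
RewriteEngine On
# Rewrite known-bogus paths to a nonexistent file under /pests,
# so the 404 is handled by /pests/.htaccess with its tiny error page.
RewriteRule ^(adxmlrpc\.php|xmlrpc\.php)$ /pests/blocked [L]
RewriteRule ^(adserver|phpAdsNew|phpadsnew|phpads|Ads|ads|xmlrpc|xmlsrv|blog|drupal)(/|$) /pests/blocked [L]
```

```apache
# /pests/.htaccess -- declare the tiny (or empty) custom error pages.
ErrorDocument 404 /pests/tiny404.html
ErrorDocument 403 /pests/tiny403.html
```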
You can also configure your server to disable keep-alive when the "/pests" 404 or 403 page is requested, in order to minimize wasted active server threads. (This can be done with a mod_headers directive in a <Files> container in /pests/.htaccess, for example.)
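One hedged way to express that, assuming mod_headers is enabled and using the same illustrative "/pests" directory and "tiny404.html" filename as above:

```apache
# In /pests/.htaccess -- ask the client to close the connection
# after receiving the tiny error page, freeing the server thread.
<Files "tiny404.html">
    Header set Connection close
</Files>
```

Whether keep-alive is actually suppressed can depend on your Apache version and MPM settings, so test against your own server.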
Jim
If you cannot configure a firewall rule to block the request before it reaches your server, all you can do is to minimize the wasted bandwidth.
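For reference, a silent drop at the firewall is what actually returns nothing at all. A sketch in iptables-restore format (the source network 192.0.2.0/24 is a documentation placeholder, since the thread doesn't identify the bots' addresses):

```
# Fragment of an iptables rules file -- silently discard port-80
# traffic from a known bad range; the client gets no response.
-A INPUT -s 192.0.2.0/24 -p tcp --dport 80 -j DROP
```

This happens before the request ever reaches Apache, which is why .htaccess alone can't replicate it.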
OK, thanks for the info. Basically I was hoping that I could use .htaccess to drop requests, just like a firewall does.
It's too bad that can't be done, because sending back a response is giving the hacker information. If the response is a 403 or 404, the hacker knows to move on. OTOH, if I send them a blank file, they'll get a 200 response code and that might encourage them to stick around. I think the best response would be nothing, so they don't know whether or not the file exists. Kind of like sending spam into a black hole so the spammer doesn't know if their email was actually delivered, or rejected. Oh well... can't win em all.