Forum Moderators: DixonJones
I've tried many approaches, but there is "something funny going on" with these requests. So far, this is the first method that has worked. In .htaccess on an Apache server:
RewriteRule sumthin - [NC,F]
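For anyone trying this in their own .htaccess, here is a slightly fuller sketch of that rule in context. This is only a sketch: it assumes mod_rewrite is loaded and that AllowOverride permits rewrite directives in .htaccess, and the "sumthin" pattern is just the probe filename discussed in this thread.

```
# Sketch only: return 403 Forbidden for any request whose path
# contains "sumthin" (the probe discussed in this thread).
RewriteEngine On
# "-" means no substitution is performed;
# [NC] = case-insensitive match, [F] = respond 403 Forbidden
RewriteRule sumthin - [NC,F]
```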
Jim
That's a possibility, but I don't know enough about the worm to consider that a good idea. I'd rather give them a 403, since a 200-OK might lead the requestor to believe that it was dealing with an already-infected machine or something.
No matter what the response code, the requestor is going to get a standard HTTP header with your server type and revision and all that.
Hopefully, some more info on this exploit will turn up.
Jim
That's a possibility, but I don't know enough about the worm to consider that a good idea.
Yeah, good point. I guess it was an instinctive "if they want a 404, give them something else," but without knowing exactly what the purpose of the request is, it's impossible to know what the result -- or effectiveness -- of that might be.
Wouldn't something like:
RewriteCond whatever_cond_we_are_looking_for
RewriteRule ^.*$ [127.0.0.1...] [L]
or
RewriteRule ^.*$ [mydomain...] [L]
redirector.cgi just logs the request and sends a redirect to [127.0.0.1...]

Would this redirect them back to their own server/computer without returning headers from our own server, and without logging unless we used a redirector script?
Thanks
GeorgeGG
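For the record, the redirect version proposed above might look something like this. The condition and target are placeholders from this thread, not a tested rule, and it assumes mod_rewrite is enabled:

```
# Sketch only: externally redirect matching requests back to the
# requester's own loopback address. The RewriteCond test is a
# placeholder for whatever condition identifies the probe.
RewriteEngine On
RewriteCond %{REQUEST_URI} sumthin [NC]
# [R=302] forces an external redirect; [L] stops further rewriting
RewriteRule ^.*$ http://127.0.0.1/ [R=302,L]
```

Note that an external redirect still sends a full response (with our server headers) telling the client where to go; the client has to choose to follow it.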
These agents typically do not follow redirects, and any response from our server is going to contain the standard info about our server. You can test your own server using the WebmasterWorld server header checker [webmasterworld.com] to see this info.
It would be nice to be able to "black hole" requests for this "sumthin" file, but that requires access to the server firewall - an option many of us don't have.
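For those who do have firewall access, a black-hole rule might look something like the following. This is a sketch assuming a Linux box with iptables and its string-match extension; the exact tool and syntax will vary by platform, and it needs root access on the server:

```
# Sketch only: silently drop inbound HTTP packets whose payload
# contains "sumthin", so the requestor gets no response at all.
# Requires root and iptables string-match support.
iptables -A INPUT -p tcp --dport 80 -m string --string "sumthin" --algo bm -j DROP
```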
Jim
These agents typically do not follow redirects
Thanks for that tidbit of info...
requires access to the server firewall
Heck, I only have a personal website :)
Thanks again
GeorgeGG