If we lived in a world actually concerned with cleaning those machines, we could send the ISPs those IPs and they'd shut them down until they were cleaned. But nope, they prefer letting their customers be unwitting parts of an underground network.
keyplyr wrote:
but I have successfully blocked many attacks.
I'm not sure what you mean by "attacks"
So two questions:
<snip>
Sometimes prior to the scrapes/attacks, the compromised accounts are tested (YMMV.)
• One particular page is requested with no other supporting files and no other requests. This is to 1.) check whether the compromised IP address is still valid, and 2.) evaluate the victim's server response. (A rough .htaccess sketch for catching this kind of lone probe follows the list.)
• Different bad actors may request completely different pages for much the same reason.
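If the probe really is a lone request for one page, it can sometimes be refused at that stage. A minimal .htaccess sketch, assuming mod_rewrite is available, a hypothetical target page name ("probe-target.html"), and probes that arrive with an empty User-Agent and no Referer; the real tells have to come from your own logs:

# Hypothetical sketch, not anyone's actual rules: refuse the lone probe.
# The filename, blank UA and missing Referer are assumptions; substitute
# whatever pattern your access log actually shows.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^$
RewriteCond %{HTTP_REFERER} ^$
RewriteRule ^probe-target\.html$ - [F]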
If they don't, and you're using a whitelist approach to blocking, then it would seem easier to block them.
wilderness wrote:
Nearly all bots (malicious or otherwise) will make some kind of initial (SOFT) probe/test, and then return later with a more extensive crawl.
If a webmaster is able to deter the visits after the initial probe, then generally the more extensive crawl never takes place.
So in this particular case, although I've blocked every single fetch attempt for the past 3 months, new fetch attempts from new IPs keep arriving at the same rate as before, about 200 per day. So it's wrong to think that you can always stop future fetch attempts by blocking current ones. That simply isn't true.
But although I've been using this code to successfully block every single fetch attempt from this botnet for more than 3 months, that hasn't stopped or even slowed down the activity.
So I started a thread in the Apache forum asking if this could be done, and Lucy responded that, yes, it could be, and that she had already done it herself in similar cases. She then posted some simple .htaccess code that does the trick.
Rather than looking at IPs, you're going to be required to implement UA denies (or even header-based denies).
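For instance (only a rough sketch, not the code posted in the thread), a UA deny plus a header check in .htaccess; the "BadBot" string and the missing Accept-Language header are assumptions to be replaced with whatever your own logs show:

RewriteEngine On
# Deny on a distinctive User-Agent string ("BadBot" is a placeholder)
RewriteCond %{HTTP_USER_AGENT} BadBot [NC]
RewriteRule .* - [F]
# Deny requests that send no Accept-Language header at all; many
# scripted fetches omit it, but so do some legitimate crawlers,
# so check your logs before relying on this.
RewriteCond %{HTTP:Accept-Language} ^$
RewriteRule .* - [F]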
I have the impression that those attacks didn't come from a botnet
Lucy was kitty-footing around
It is not supposed to prevent attempts, it is supposed to prevent success.
Most people are careless about their computer's security and happily click on links leading to wealth untold or bigger and better