First, ARRRRGHH.
Thanks. I needed that.
These should be required reading for anyone trying to defend their server... (I just read them.)
Hackers Use MIT Server to Hack 100,000 Sites:
[dailytech.com...]
Advice on dealing with the robots involved:
[securityweek.com...]
Now that I've read them, I find I have part of the answer to a couple of my earlier questions.
It doesn't matter how small your site is; it only matters whether you have a vulnerability. The tireless robots will find you. Some find you simply by iterating through IP addresses, even if you have no DNS entry or search-engine visibility.
I've seen everything in the articles come through in my logs recently.
So. Now I know... don't trust an *.edu .. be worried by // ... keep a sharp eye out for a muieblackcat directory ... and for probes hitting setup scripts, phpMyAdmin, login, register ... even a random filename can be an attempt to follow up on a previously successful write deep inside your directory tree, in ANY directory (the more obscure, the better). And some images hotlinked to scraper sites. And so on.
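A quick way to spot these probes is to scan your access log for the signatures above. This is only a minimal sketch; the pattern list and the sample log lines are illustrative, and a real list should be built from your own logs.

```python
import re

# Illustrative probe signatures based on the patterns described above.
PROBE_PATTERNS = [
    r"muieblackcat",          # well-known PHP vulnerability scanner
    r"phpmyadmin",            # probes for phpMyAdmin installs
    r"/setup\.php",
    r"/register",
    r"/login",
]
PROBE_RE = re.compile("|".join(PROBE_PATTERNS), re.IGNORECASE)

def suspicious_requests(log_lines):
    """Return the log lines whose request matches a known probe pattern."""
    return [line for line in log_lines if PROBE_RE.search(line)]

# Example with made-up log lines:
sample = [
    '1.2.3.4 - - [10/Oct/2012] "GET /muieblackcat HTTP/1.1" 404 -',
    '5.6.7.8 - - [10/Oct/2012] "GET /index.html HTTP/1.1" 200 -',
]
hits = suspicious_requests(sample)  # only the muieblackcat probe is flagged
```

From there you can feed the matching IPs to whatever blocking mechanism you use.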
A couple of years ago, a careless support person left a backup directory with write permissions, which I was unaware of until the effects of the malicious files a robot placed there came to light weeks later. Within seconds of the write, the robot had spammed hundreds (if not thousands) of vulnerable forums and galleries with dozens of links back to those files. It still makes me sick to think of the grief all that caused.
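A periodic permissions sweep catches exactly this kind of forgotten writable directory. Here is a minimal sketch; the web-root path in the example is hypothetical, so substitute your own.

```python
import os
import stat

def world_writable_dirs(root):
    """Walk the tree under `root` and collect directories anyone can write to."""
    found = []
    for dirpath, dirnames, filenames in os.walk(root):
        mode = os.stat(dirpath).st_mode
        if mode & stat.S_IWOTH:  # the "other" write bit is set
            found.append(dirpath)
    return found

# Example sweep (path is hypothetical):
# for d in world_writable_dirs("/var/www"):
#     print("world-writable:", d)
```

Run it from cron and mail yourself the output; an empty report becomes the normal case, and any surprise stands out.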
A different attack, which happened I know not how, took months of effort to stomp out the last infection.
As far as I can tell, those trials by fire are keeping me more alert. I hope by now my defenses are better than the robot attacks.
Pardon me, while I go outside and scream again.