I have a vanity website (non-commercial, no ads) that has been online for over 23 years. I won't give it up, a vanity thing, but in recent months it has been attacked by bad actors from countries known for meddling, and a mess of malformed inbound requests is playing havoc in my logs.
Is there any downside to forcing a 403?
For example, I do not use PHP, so:
SetEnvIf Request_URI "\.php" ban
In recent weeks I have added com, net, org, it, jp$, ru, cn, asp, ua, sa, live, life, info, and so on.
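In case it matters, the way those flags actually take effect in my setup is roughly this (sketched from memory, Apache 2.2-style Order/Allow/Deny; the extension list here is just illustrative):

```apache
# Flag requests for things this site never serves
SetEnvIf Request_URI "\.(php|asp|aspx)$" ban

# The "ban" flag only bites when paired with a matching Deny
Order Allow,Deny
Allow from all
Deny from env=ban
```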
Bad bots have always been handled with
SetEnvIfNoCase Referer "something" ban
Unfortunately, the number of those has doubled! (I have a robots.txt, and crawlers that play nice are not in the ban list.) That said, requests for robots.txt itself are always allowed through...
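A typical pair of those referer rules, with the robots.txt exemption wired in, looks something like this (the referer hostname is purely illustrative, not one of my actual entries):

```apache
# Flag requests whose Referer matches a known bad bot
SetEnvIfNoCase Referer "badbot\.example" ban

# Clear the flag for robots.txt so every crawler can still read it
SetEnvIf Request_URI "^/robots\.txt$" !ban
```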
The Deny from list (by IP) is becoming ever more significant, as these are country-specific, and the most egregious are outright stealing my content. I have taken to using:
xxx
and
xxx.xxx or (if there's little variation)
xxx.xxx.xxx
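For what it's worth, Apache treats a bare prefix in Deny from as a network match, so the shortened forms above each block a whole range (the addresses below are documentation-range examples, not real offenders):

```apache
# A partial address denies the entire network it implies
Deny from 203.0.113      # all of 203.0.113.0 - 203.0.113.255
Deny from 198.51         # all of 198.51.0.0  - 198.51.255.255
```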
I realize I am stuck in the stone age as regards IPv6, but so far my logs are only showing IPv4 addresses...
This last part of my question is where I am a stranger. I have yet to wrap this aging brain around CIDR ranges, or to find country-specific range lists that I can cut and paste. Meanwhile, I am educating myself the hard way (looking up IP addresses one by one) and hope to get there dang quick.
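From what I've gathered so far (please correct me if this is wrong), CIDR is just the slash form of the same prefix idea, so the partial addresses I'm already using have direct equivalents; again, documentation-range example addresses:

```apache
# /N fixes the first N bits of the address; each octet is 8 bits
Deny from 203.0.0.0/8      # same as "Deny from 203":       203.*.*.*
Deny from 203.0.113.0/24   # same as "Deny from 203.0.113": 203.0.113.*
# Published country lists usually come as CIDR blocks like these, one per line
```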
Just checking to see if I am headed off track in dealing with this problem.
Any advice (or raspberries for being clueless) is welcome.