Forum Moderators: phranque

Message Too Old, No Replies

server hacked

Lot of server activity from cracker clients


craigt

9:10 pm on Nov 23, 2014 (gmt 0)

10+ Year Member



Good afternoon.

My error log had the following entries this morning.

[Sat Nov 22 05:42:15 2014] [error] [client 222.216.28.248] script not found or unable to stat: C:/usr/www/example/cgi-bin/php5
[Sat Nov 22 05:42:16 2014] [error] [client 222.216.28.248] script not found or unable to stat: C:/usr/www/example/cgi-bin/php-cgi
[Sat Nov 22 05:42:17 2014] [error] [client 222.216.28.248] script not found or unable to stat: C:/usr/www/example/cgi-bin/php.cgi
[Sat Nov 22 05:42:18 2014] [error] [client 222.216.28.248] script not found or unable to stat: C:/usr/www/example/cgi-bin/php4
[Sat Nov 22 13:11:24 2014] [error] [client 189.171.48.1] File does not exist: C:/usr/www/example/gege
[Sat Nov 22 13:11:25 2014] [error] [client 189.171.48.1] File does not exist: C:/usr/www/example/phpMyAdmin
[Sat Nov 22 13:11:26 2014] [error] [client 189.171.48.1] File does not exist: C:/usr/www/example/pma
[Sat Nov 22 13:11:27 2014] [error] [client 189.171.48.1] File does not exist: C:/usr/www/example/myadmin
[Sat Nov 22 22:55:20 2014] [error] [client 1.2.172.71] File does not exist: C:/usr/www/example/bsbs
[Sat Nov 22 22:55:21 2014] [error] [client 1.2.172.71] File does not exist: C:/usr/www/example/phpMyAdmin
[Sat Nov 22 22:55:22 2014] [error] [client 1.2.172.71] File does not exist: C:/usr/www/example/pma
[Sat Nov 22 22:55:23 2014] [error] [client 1.2.172.71] File does not exist: C:/usr/www/example/myadmin
[Sun Nov 23 00:56:40 2014] [error] [client 218.164.97.122] File does not exist: C:/usr/www/example/ntnt
[Sun Nov 23 00:56:41 2014] [error] [client 218.164.97.122] File does not exist: C:/usr/www/example/phpMyAdmin
[Sun Nov 23 00:56:42 2014] [error] [client 218.164.97.122] File does not exist: C:/usr/www/example/pma
[Sun Nov 23 00:56:43 2014] [error] [client 218.164.97.122] File does not exist: C:/usr/www/example/myadmin
[Sun Nov 23 06:48:05 2014] [error] [client 66.135.34.113] script not found or unable to stat: C:/usr/www/example/cgi-bin/php.exe
[Sun Nov 23 06:48:06 2014] [error] [client 66.135.34.113] script not found or unable to stat: C:/usr/www/example/cgi-bin/php5.exe
[Sun Nov 23 06:48:06 2014] [error] [client 66.135.34.113] script not found or unable to stat: C:/usr/www/example/cgi-bin/php-cgi.exe
[Sun Nov 23 06:48:07 2014] [error] [client 66.135.34.113] script not found or unable to stat: C:/usr/www/example/cgi-bin/cgi.exe
[Sun Nov 23 06:48:07 2014] [error] [client 66.135.34.113] script not found or unable to stat: C:/usr/www/example/cgi-bin/php4.exe

My access log had the following entries.

1.2.172.71 - - [22/Nov/2014:22:55:20 -0500] "GET /bsbs/bsb/bs.php HTTP/1.1" 404 318
1.2.172.71 - - [22/Nov/2014:22:55:21 -0500] "GET /phpMyAdmin/scripts/setup.php HTTP/1.1" 404 331
1.2.172.71 - - [22/Nov/2014:22:55:22 -0500] "GET /pma/scripts/setup.php HTTP/1.1" 404 324
1.2.172.71 - - [22/Nov/2014:22:55:23 -0500] "GET /myadmin/scripts/setup.php HTTP/1.1" 404 328
218.164.97.122 - - [23/Nov/2014:00:56:40 -0500] "GET /ntnt/ntn/nt.php HTTP/1.1" 404 318
218.164.97.122 - - [23/Nov/2014:00:56:41 -0500] "GET /phpMyAdmin/scripts/setup.php HTTP/1.1" 404 331
218.164.97.122 - - [23/Nov/2014:00:56:42 -0500] "GET /pma/scripts/setup.php HTTP/1.1" 404 324
218.164.97.122 - - [23/Nov/2014:00:56:43 -0500] "GET /myadmin/scripts/setup.php HTTP/1.1" 404 328
212.83.138.153 - - [23/Nov/2014:02:04:26 -0500] "GET / HTTP/1.1" 200 15675
157.55.39.6 - - [23/Nov/2014:02:56:37 -0500] "GET /robots.txt HTTP/1.1" 200 430
157.55.39.5 - - [23/Nov/2014:02:56:58 -0500] "GET / HTTP/1.1" 200 2802
104.192.0.19 - - [23/Nov/2014:06:26:10 -0500] "GET / HTTP/1.0" 200 15678
66.135.34.113 - - [23/Nov/2014:06:48:05 -0500] "GET //cgi-bin/php.exe HTTP/1.1" 404 263
66.135.34.113 - - [23/Nov/2014:06:48:06 -0500] "GET //cgi-bin/php5.exe HTTP/1.1" 404 263
66.135.34.113 - - [23/Nov/2014:06:48:06 -0500] "GET //cgi-bin/php-cgi.exe HTTP/1.1" 404 264
66.135.34.113 - - [23/Nov/2014:06:48:07 -0500] "GET //cgi-bin/cgi.exe HTTP/1.1" 404 262
66.135.34.113 - - [23/Nov/2014:06:48:07 -0500] "GET //cgi-bin/php4.exe HTTP/1.1" 404 264
104.236.27.63 - - [23/Nov/2014:07:03:16 -0500] "GET /parts/brief.html HTTP/1.1" 200 2166
178.62.214.203 - - [23/Nov/2014:07:03:30 -0500] "GET /shom3ifrm.html HTTP/1.1" 200 326
104.236.27.69 - - [23/Nov/2014:07:03:42 -0500] "GET /shom4.html HTTP/1.1" 200 484
198.211.117.78 - - [23/Nov/2014:07:03:52 -0500] "GET /m1demo/m1.htm HTTP/1.1" 200 587
162.243.1.48 - - [23/Nov/2014:07:03:54 -0500] "GET /parts/m3.html HTTP/1.1" 200 1888
198.199.68.18 - - [23/Nov/2014:07:04:15 -0500] "GET /shom2ifrm.html HTTP/1.1" 200 325
104.131.146.120 - - [23/Nov/2014:07:04:16 -0500] "GET /parts/acks.html HTTP/1.1" 200 2004
95.85.39.206 - - [23/Nov/2014:07:04:30 -0500] "GET /parts/m1.html HTTP/1.1" 200 2102
128.199.232.11 - - [23/Nov/2014:07:05:29 -0500] "GET /docs/scdoce.doc HTTP/1.1" 200 16202
178.62.219.89 - - [23/Nov/2014:07:05:39 -0500] "GET /parts/potential.html HTTP/1.1" 200 1095
104.131.135.7 - - [23/Nov/2014:07:05:39 -0500] "GET /parts/features.html HTTP/1.1" 200 1710
104.236.27.65 - - [23/Nov/2014:07:05:40 -0500] "GET /parts/addedvalue.html HTTP/1.1" 200 988
162.243.164.227 - - [23/Nov/2014:07:05:40 -0500] "GET /parts/roi.html HTTP/1.1" 200 1253
104.236.27.68 - - [23/Nov/2014:07:06:04 -0500] "GET /parts/di.html HTTP/1.1" 200 638
188.226.169.215 - - [23/Nov/2014:07:06:05 -0500] "GET /parts/priceom.html HTTP/1.1" 200 572
178.62.158.69 - - [23/Nov/2014:07:06:06 -0500] "GET /m3demo/m3.htm HTTP/1.1" 200 373
192.241.248.155 - - [23/Nov/2014:07:06:07 -0500] "GET /parts/m2.html HTTP/1.1" 200 1768
162.243.226.174 - - [23/Nov/2014:07:06:27 -0500] "GET /shodhtml2.html HTTP/1.1" 200 332
178.62.99.54 - - [23/Nov/2014:07:06:28 -0500] "GET /parts/company.html HTTP/1.1" 200 1130
128.199.154.245 - - [23/Nov/2014:07:06:29 -0500] "GET /parts/idea.html HTTP/1.1" 200 3510
178.62.152.120 - - [23/Nov/2014:07:06:39 -0500] "GET /sge.html HTTP/1.1" 200 258
104.131.146.120 - - [23/Nov/2014:07:08:16 -0500] "GET /m2demo/m2.htm HTTP/1.1" 200 373
162.243.1.48 - - [23/Nov/2014:07:08:28 -0500] "GET /parts/ii.html HTTP/1.1" 200 1004
125.64.35.67 - - [23/Nov/2014:07:54:10 -0500] "GET http://6.example.cn/zc/chs/img/body.png HTTP/1.1" 404 259

My static IP starts with 72. I think I'm being hacked. These IPs are from all over the globe. Looks like they are probing my server and executing parts of the website that this server hosts. And I'm getting this kind of server activity quite often based on the logs. These people don't have much to do.

Would someone please comment on what they see here, what could happen, my potential exposure, and what I should do to prevent any destructive behavior? The application the server hosts is simply an idea of my own design and development, hardly of any interest to a cracker, I would think.

I'm on a Windows 7 platform using Apache/2.0.64 (Win32), mod_perl/2.0.3, and Perl/v5.8.3. I work with the firewall down because my application does not seem to be visible to the WWW with it up (probably a gap in my understanding). I run MSE all the time and MalwareBytes regularly.

Thanks.

[edited by: phranque at 4:05 pm (utc) on Nov 25, 2014]
[edit reason] exemplified domain [/edit]

wilderness

2:16 pm on Nov 25, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I think I'm being hacked.


hardly

These IPs are from all over the globe.


It's normal for visitor-activity to be global on any website.

Please note: two of the lines are actually the MSN/Bing bot.

Looks like they are probing my server and executing parts of the website that this server hosts.


The first half of your postings are PHP requests/probes that we are all getting.
The 404 result is normal when the files do not exist.

Would someone please comment on what they see here, what could happen or my potential exposure,


The long-range exposure (vulnerability) is that these pests will continue to return because you're not taking any corrective action to prevent their return (i.e., denied access or a 403).

and what I should do to prevent any destructive behavior.


Most of these IPs are server farms.
Please see the many Server Farm threads in the SSID Forum.
You need to explore the option of restricting visitors to your site(s), based upon what is beneficial or detrimental to your site(s).

FWIW, it appears to me that you're lacking the experience to determine which visitors within your access logs are beneficial and/or detrimental.
It's a learned process, and there is not a copy and paste solution.

craigt

2:29 pm on Nov 25, 2014 (gmt 0)

10+ Year Member



Thanks for the reply, wilderness. You're right about that experience.

not2easy

5:14 pm on Nov 25, 2014 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Skipping past the top part which is all [error] listings, you can see 14 more 404 (not found) errors:
1.2.172.71 - - [22/Nov/2014:22:55:20 -0500] "GET /bsbs/bsb/bs.php HTTP/1.1" 404 318
1.2.172.71 - - [22/Nov/2014:22:55:21 -0500] "GET /phpMyAdmin/scripts/setup.php HTTP/1.1" 404 331
1.2.172.71 - - [22/Nov/2014:22:55:22 -0500] "GET /pma/scripts/setup.php HTTP/1.1" 404 324
1.2.172.71 - - [22/Nov/2014:22:55:23 -0500] "GET /myadmin/scripts/setup.php HTTP/1.1" 404 328
66.135.34.113 - - [23/Nov/2014:06:48:05 -0500] "GET //cgi-bin/php.exe HTTP/1.1" 404 263
66.135.34.113 - - [23/Nov/2014:06:48:06 -0500] "GET //cgi-bin/php-cgi.exe HTTP/1.1" 404 264
66.135.34.113 - - [23/Nov/2014:06:48:06 -0500] "GET //cgi-bin/php5.exe HTTP/1.1" 404 263
66.135.34.113 - - [23/Nov/2014:06:48:07 -0500] "GET //cgi-bin/cgi.exe HTTP/1.1" 404 262
66.135.34.113 - - [23/Nov/2014:06:48:07 -0500] "GET //cgi-bin/php4.exe HTTP/1.1" 404 264
125.64.35.67 - - [23/Nov/2014:07:54:10 -0500] "GET http://6.example.cn/zc/chs/img/body.png HTTP/1.1" 404 259
218.164.97.122 - - [23/Nov/2014:00:56:40 -0500] "GET /ntnt/ntn/nt.php HTTP/1.1" 404 318
218.164.97.122 - - [23/Nov/2014:00:56:41 -0500] "GET /phpMyAdmin/scripts/setup.php HTTP/1.1" 404 331
218.164.97.122 - - [23/Nov/2014:00:56:42 -0500] "GET /pma/scripts/setup.php HTTP/1.1" 404 324
218.164.97.122 - - [23/Nov/2014:00:56:43 -0500] "GET /myadmin/scripts/setup.php HTTP/1.1" 404 328


These look like automated bots scanning for vulnerabilities in your setup. You may want to block those IPs from accessing your site (I would). Don't bother with blocking one IP to stop bots, they just come in with another IP on the same server. For example, if you do a whois lookup for "66.135.34.113" you can see that it is coming from "ServerBeach" so you know it is not a person visiting your site. Whois gives you this information:
66.135.32.0 - 66.135.63.255
CIDR: 66.135.32.0/19
Organization: ServerBeach (SERVER-17)

So to block that robot and any others from that server you use:
deny from 66.135.32.0/19

in your list of blocked IPs.

The rest in your list might appear to be visitors, except they only request ".html" files, no css, no images, no .js files. That's not normal unless the logs are set up to only log certain file types. If your logs don't log any images, css or other requests I would work on getting them to show that because otherwise you can't tell human visits from bot visits quite so easily. In the meantime, do whois lookups and see how many of those might be people - and read around the Library and Charter of this forum to help you get started with blocking unwanted traffic.

craigt

6:42 pm on Nov 25, 2014 (gmt 0)

10+ Year Member



Thanks not2easy. I'll take your suggestion.

lucy24

6:53 pm on Nov 25, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I would work on getting them to show that

LogFormat and LogLevel can't be changed in htaccess, so this would involve changing hosts (a legitimate solution if logs really are that restricted-- but I honestly doubt that's the case).

No two people handle blocking in the same way. But there are two basics:
-- locking out IP ranges unconditionally using mod_authz-thingy:
Deny from 12.3.etcetera
You can then use <Files> or <FilesMatch> envelopes to let everyone read selected files such as robots.txt
-- more sophisticated lockouts looking at other aspects of the request, using mod_setenvif and/or mod_rewrite (your host may also use mod_security as an optional extra)
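
A minimal 2.2-style sketch combining both approaches (the IP range and the User-Agent string here are hypothetical placeholders, and module availability depends on your build):

# Unconditional lockout of an example range (203.0.113.0/24 is a documentation range)
Order Allow,Deny
Allow from all
Deny from 203.0.113.0/24

# Conditional lockout keyed on the User-Agent, via mod_setenvif
SetEnvIf User-Agent "EvilScanner" block_me
Deny from env=block_me

# Envelope letting everyone read robots.txt regardless of the rules above
<Files "robots.txt">
Order Allow,Deny
Allow from all
</Files>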

That's the "how" part. The "who" part also depends on your individual preferences. For example, you might block selected countries, or block all .php requests (if your URLs never end in php), or block elderly browsers. Nobody can make these decisions for you. But if you narrow it down to "I want to ban such-and-such visitors" we can sometimes help you work out how to do it.

not2easy

9:05 pm on Nov 25, 2014 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



I put it that way because it sounded like it was self hosted:
I'm on Windows 7 platform using Apache/2.0.64 (Win32,) mod_perl/2.0.3, and Perl/v5.8.3. I work with the firewall down because my application does not seem to be visible to the WWW with it up (probably my understanding). I run MSE all the time and MalwareBytes regularly.
and I thought it might be a personal preference setting that could be adjusted.

lucy24

9:32 pm on Nov 25, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Oops, my bad, missed that part.

:: detour to Apache docs ::

Well, that was useful. Turns out that my logs (probably most people's logs on shared hosting) use Combined Log Format. That's the one that includes the referer and UA, each in quotation marks. Technically it's a nickname that can be defined at will, but further detour to MAMP suggests that the names "common" and "combined" are part of Apache config boilerplate.
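
For reference, those two nicknames come from LogFormat lines in the stock httpd.conf, along these lines (standard Apache boilerplate; exact contents can vary by distribution):

LogFormat "%h %l %u %t \"%r\" %>s %b" common
LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\"" combined
CustomLog logs/access_log combined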

It looks as if conditional logs (generically) require some extra business involving mod_setenvif, for example
SetEnvIf Request_URI "\.(jpg|css|png|js)$" dontlog
CustomLog logs/access_log common env=!dontlog

But you'd be bonkers to use it on a new site, because then how would you identify the robots?

craigt

4:40 pm on Nov 26, 2014 (gmt 0)

10+ Year Member



Thanks for all the responses. I have a <Directory> block defined in my httpd.conf for my server root. It currently allows all. I'm thinking that I can modify it as follows to allow all requests from my redirected-IP service for any part of my website code, and for the robots.txt file. PHP will not execute on my server.

Order deny,allow
Deny from all
Allow from example.no-ip robots.txt

What else should I include?

I don't want to deny access by IP address because requests to look at a part of my website could come from anywhere, client wise. But I want the requests to refer to the server alias used by the redirected IP service I employ - example.no-ip.

What about automatic maintenance by MS, Oracle, Adobe, and the like?

My robots.txt file excludes all subdirectories by name and allows indexing of only html files beginning with 'o'. It's been very effective based on keyword search results.

Any other considerations that you all typically include or that might be relevant in these deny,allow definitions?

Thanks again for all the help.

[edited by: phranque at 7:13 pm (utc) on Nov 26, 2014]
[edit reason] no specifics, please [/edit]

lucy24

9:26 pm on Nov 26, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I don't want to deny access by IP address because requests to look at a part of my website could come from anywhere, client wise.

This really is a brand-new site, isn't it? You will start singing a different tune once you discover just how many Ukrainian robots and Chinese scrapers stop by with nothing but malign intentions. If you've got globetrotting administrators, you'll want some form of authentication involving a "Satisfy" directive to override IP-based lockouts. (Careful! There have been some substantive changes in this area between 2.2 and 2.4, though so far 2.4 should be backward-compatible.)
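
A hedged sketch of that "Satisfy" pattern under 2.2 (the directory path, IP range, and password file are all made-up placeholders):

<Directory "C:/usr/www/example/private">
Order Deny,Allow
Deny from all
# Trusted administrator range (hypothetical)
Allow from 203.0.113.0/24
AuthType Basic
AuthName "Administrators"
AuthUserFile "C:/usr/apache/passwords"
Require valid-user
# "Any" = let the request through if EITHER the IP check
# OR the password login succeeds; the default "All" requires both.
Satisfy Any
</Directory>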

Allow from example.no-ip robots.txt

Can you do this in 2.2? I thought you needed a <Files> envelope. Also think twice about using any non-numerical argument in your Allow/Deny directives, since it throws the whole server into Lookups mode.

phranque

7:30 am on Nov 27, 2014 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



welcome to WebmasterWorld, craigt!


the Allow Directive [httpd.apache.org] typically takes an IP and/or hostname specification.

wilderness

1:41 pm on Nov 27, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



FWIW, a word of caution when using a "hostname specification"!

It results in a change in your log FORMAT, changing normal IPs to names, and basically making extra work for the webmaster.

lucy24

7:29 pm on Nov 27, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



making extra work for the webmaster

It also makes more work for the server, because it has to perform a lookup on every single IP. And all it takes is one non-CIDR item on the list. Earlier this year it happened to me by accident after I'd added one superfluous comma-- literally-- to a "Deny from" list.

craigt

4:35 pm on Nov 28, 2014 (gmt 0)

10+ Year Member



Thanks all for the direction. You all have really prodded me to begin to think about managing my server.

As I said earlier, I'm working on an idea of my own. My website presents the idea. So the server is being employed to serve the website that presents the idea I've been developing. It's been more of a design/development/test environment until now.

I've been focused on application and website development. I've worked through the transition to a new OS version (XP to 7) while I've been doing this work. But the server, language, and language performance sets have been rock solid. The server install was pretty generic, with a few exceptions like mod_perl. I have had some malware problems, probably because my firewall has been down and my server is not set up with security in mind, as you all have pointed out.

I need to study the Windows firewall and Apache in a little more depth. I am using Apache 2.0.64 and have been looking at the 2.0 docs.

Thanks again, everyone, for your generosity. I work pretty much alone, and this kind of experience is invigorating. It's almost like having company.

lucy24

8:22 pm on Nov 28, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I am using Apache 2.0.64

Make sure you mention this in any future posts. I generally assume 2.2 unless explicitly told otherwise.

craigt

4:47 pm on Nov 29, 2014 (gmt 0)

10+ Year Member



Do these crackers use static IPs? I've put an IP-specific approach in my conf file and am scanning the access and error logs daily for potentially wayward IPs. This allows the 'good' bots and any IP I haven't denied, I believe. This sounds right for me.

Order deny,allow
Allow from all
Deny from 207.46.132.96 etc.

But I can see this is going to be a long list before too long. Is this growing list a common webmaster experience?

Can I identify IP blocks that cracker organizations employ in any way other than looking for recurring patterns in the high-order parts of IPs in my error and access logs? Is there perhaps an organization that tracks cracker IPs or IP blocks?

wilderness

6:05 pm on Nov 29, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



207.46.132.96


This is a bad practice (denying the precise Class D).
FWIW, this is the MSN/Bing bot.

If you wish to deny that MSN/Bing bot, then use
207.46.0.0/16 or 207.46. (both accomplish the same thing)

Whois (do a Google search on WHOIS) gives you this information:


Don't bother with blocking one IP to stop bots, they just come in with another IP on the same server. For example, if you do a whois lookup for "66.135.34.113" you can see that it is coming from "ServerBeach" so you know it is not a person visiting your site. Whois gives you this information:

So to block that robot and any others from that server you use:
deny from 66.135.32.0/19


you were previously provided with the above example.

wilderness

6:08 pm on Nov 29, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Can I identify IP blocks that cracker organizations employ in any way other than looking for recurring patterns in the high order parts of IPs in my error and access logs? Is there perhaps an organizations that tracks cracker IPs or IP blocks?


you were also previously provided with the following:

FWIW, it appears to me that you're lacking the experience to determine which visitors within your access logs are beneficial and/or detrimental.
It's a learned process, and there is not a copy-and-paste solution.

lucy24

9:27 pm on Nov 29, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



For future reference I looked it up. Apache 2.2 has mod_authz_host. This didn't exist in 2.0; they used (emphasis theirs)
This document refers to the 2.0 version of Apache httpd, which is no longer maintained. Upgrade, and refer to ...

... Huh. It's still mod_access. I thought that was only 1.3, possibly because a lot of sites seem to have jumped straight from 1.3 to 2.2. But the syntax is essentially the same; only the name has changed.

One of the unchanged parts is:

Allow from all|host|env=[!]env-variable [host|env=[!]env-variable] ... 

That means you can list more than one host/IP/environment in the same line, but they're parallel, not conditional. So if you said
Allow from qiniq.ca robots.txt

it's exactly the same as
Allow from qiniq.ca
Allow from robots.txt

What you meant to say was
<Files "robots.txt">
Order Allow,Deny
Allow from all
</Files>

craigt

7:48 pm on Nov 30, 2014 (gmt 0)

10+ Year Member



What I'm doing is examining my Apache logs, identifying what looks to me to be suspicious activity on my server, like a client trying to execute PHP stuff or running a probe of some kind, and reading what Whois says about the client. Based on that, I'm deciding whether or not to block the client or its whole block. I'm allowing all but those I block. I've also added the robots.txt envelope.

I would like to block PHP activity; I'm not using PHP at this time. What else? Please nudge me in any way you all feel I could benefit from.

lucy24

10:33 pm on Nov 30, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I would like to block PHP activity. I'm not using PHP at this time.

I think I said earlier that the exact "who" rules are site-specific. For example, I've got a rule that looks like this:

RewriteCond %{HTTP_REFERER} example\.com/?$
RewriteCond %{REQUEST_URI} !index\.html
RewriteCond %{REQUEST_URI} !/boilerplate/
RewriteRule ^([^/.]+/)+[^/.]+(\.html|/)$ - [F,NS]

It means: Block any request for an internal file if it gives the root ("example.com") as referer. On this site, no deeper file outside of the /boilerplate/ directory is linked directly from the front page. And robots often give your root as a generic referer to allay suspicion. (Other robots use auto-referers, which are harder to block in htaccess alone.) The [NS] flag is to bypass includes; the index.html exception is because the index redirect-- like most redirects-- comes after the [F] rules.

And here's my own php rule:

RewriteCond %{THE_REQUEST} \.php
RewriteCond %{REQUEST_URI} !(Eskimo|fun/panda)
RewriteRule \.php - [F,NS]

Again, the [NS] flag is to bypass includes-- the only place I normally use php. (This probably makes the first Condition redundant, but it's well to be safe.) And then the two specific files named in the second Condition are the other exceptions. If someone comes in asking for .php they're up to no good, even if they happen to come from a previously unfamiliar IP, so I'd prefer to hit them with an immediate 403 instead of defaulting to 404.

craigt

8:11 pm on Dec 1, 2014 (gmt 0)

10+ Year Member



Thanks lucy24, wilderness, not2easy, and phranque for helping me think about closing my exposure.

I'm allowing all, except IPs and IP blocks I specifically deny, and .php requests. I also allow access to robots.txt. I'm examining my logs on a periodic basis, looking for suspicious client activity. I'm using Whois to examine suspicious clients. On the basis of the logs and Whois, I'm deciding whether or not to block the IP or IP block.

I'm going to look at upgrading Apache and learning about what I upgrade to. Thanks again all. I'll be back I'm sure.