
Recent probing of my site

Newbie webmaster seeking wisdom

     
12:28 am on July 29, 2017 (gmt 0)

New User from CA 

joined:July 9, 2017
posts:39
votes: 2


Recently I have noticed a number of consistently similar probings of my website from different IP addresses all over the world. The script used contains 105 "targets", starting with /mysql/admin/, /mysql/dbadmin/, /mysql/sqlmanager/, ... and ending with /phpmyadmin2018/, and /phpmanager/. The list of 105 targets is always the same. The attempted attacks come two or three times each day and they are always directed at the non-SSL (http) version of my domain's IP address, and of course they get nothing because none of those directories exist on my site. I have not noticed any repeat attackers. I expect this is just script kiddies using the "latest" script that is going around and I have been ignoring them except for reporting their IP addresses to Project Honeypot when I have the time.

Is there anything, obvious or less so, that I should be aware of, or guarding against, regarding these probing events? (Maybe I should be more concerned about the "scrapers" and hot-linkers.) Thanks for any insight.
1:22 am on July 29, 2017 (gmt 0)

Moderator from US 

WebmasterWorld Administrator keyplyr is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Sept 26, 2001
posts:11758
votes: 738


Hi Martin Potter, and welcome to WebmasterWorld [webmasterworld.com].

These are common vulnerability probes to see if your site has any open doors. If any are found, a different agent from a different IP address will likely carry out the malicious actions.

Many of these probes come from compromised servers or ISP accounts that will likely be fixed in time, so blocking is probably futile.

If your site runs a CMS that uses these files, make sure you keep your software up to date. If you do not use these files, just ignore the requests and let your server return a 403 Forbidden.

However, many webmasters feel it's proactive to block [webmasterworld.com] all server farm ranges.
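For illustration, a range block of that kind might look like this in .htaccess (a sketch, assuming Apache 2.4; the CIDRs below are documentation ranges, not real recommendations):

<RequireAll>
# allow everyone except the listed hosting ranges
Require all granted
Require not ip 192.0.2.0/24
Require not ip 198.51.100.0/24
</RequireAll>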
2:11 am on July 29, 2017 (gmt 0)

Administrator from US 

WebmasterWorld Administrator not2easy is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Dec 27, 2006
posts:3761
votes: 206


Are you seeing these requests in your access logs or is this information taken from some stats/analytics service? I ask because the more complete information would be seen in your raw access logs.

A few current discussions about this kind of 'traffic' that could help with more information are here: [webmasterworld.com...] and here: [webmasterworld.com...]
6:13 am on July 29, 2017 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:14780
votes: 635


let your server return a 403 Forbidden.

In some ways a 404 is the very best thing to send unwanted visitors, because it conveys absolutely no information. A 403 says "we're onto you"; a 404 says nothing. If possible, return the 404 manually (from a rewrite rule, for example) so your server doesn't have to waste time looking for nonexistent files. You might do it for URLs ending in .php, or ones beginning with /mysql, or ones containing strings like "wp" or "admin"--anything that doesn't occur naturally on your site, or that doesn't form part of visible URLs. In fact If I Had It To Do Over Again I would call my /includes/ directory something else, and then any and all requests for /includes could be hit with a resounding 404.
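A minimal .htaccess sketch of that idea, assuming Apache with mod_rewrite (the patterns are the examples from this thread; only use strings that never occur in your real URLs):

RewriteEngine On
# answer probes for /mysql... or /phpmyadmin... with an immediate 404,
# without ever touching the filesystem
RewriteRule ^(mysql|phpmyadmin) - [NC,R=404,L]
# same for any path containing "admin" or "wp"
RewriteRule (admin|wp) - [NC,R=404,L]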
12:06 pm on July 29, 2017 (gmt 0)

Preferred Member from CA 

Top Contributors Of The Month

joined:Feb 7, 2017
posts: 356
votes: 33


I usually ban these security pokes. Where there's a penetration test, a hack can follow. Do they have a common UA? There are some well-known pen test tools out there. I use a couple.
7:50 pm on July 29, 2017 (gmt 0)

New User from CA 

joined:July 9, 2017
posts:39
votes: 2


Hi, keyplyr, thanks for the welcome.
If your site runs a CMS that uses these files

No, I don't run any of the services they are looking for, so I am pretty safe there. But they are addressing the http address of my site, and all of their requests get a "301" response, which, so far, they have ignored.
7:54 pm on July 29, 2017 (gmt 0)

New User from CA 

joined:July 9, 2017
posts:39
votes: 2


Are you seeing these requests in your access logs

Yes, in the logs. Sometimes in cPanel's Visitors log but usually in the raw access logs. As you say, more info there. :-)
7:58 pm on July 29, 2017 (gmt 0)

New User from CA 

joined:July 9, 2017
posts:39
votes: 2


In some ways a 404 is the very best thing to send unwanted visitors, because it conveys absolutely no information.

Lucy24, good idea. I will change what I can in the .htaccess file. Thanks!
8:03 pm on July 29, 2017 (gmt 0)

New User from CA 

joined:July 9, 2017
posts:39
votes: 2


Do they have a common UA?

Good point, TorontoBoy, and I should have mentioned that. The UA is always:
"Mozilla/5.0 Jorgee"
Is this new or has it been around for a while?

P.S. - I am near Ottawa. :-)
8:05 pm on July 29, 2017 (gmt 0)

Moderator from US 

WebmasterWorld Administrator keyplyr is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Sept 26, 2001
posts:11758
votes: 738


No, I don't run any of the services they are looking for, so I am pretty safe there. But they are addressing the http address of my site, and all of their requests get a "301" response, which, so far, they have ignored.
As I said, these vulnerability scans are common. I see them every day. If you don't have these files, just ignore the requests and let your server return a 403 Forbidden.

However, the requests won't stop. This is not a smart bot, only a script checking to see if these files are accessible. Returning a 403 will likely deter the second bot that would come if an exploit were actually found.
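For anyone who would rather answer this particular script explicitly, a UA match is enough (a sketch, assuming Apache with mod_rewrite and the "Jorgee" string reported above):

RewriteEngine On
# return 403 Forbidden to anything announcing itself as Jorgee
RewriteCond %{HTTP_USER_AGENT} Jorgee [NC]
RewriteRule ^ - [F]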
8:07 pm on July 29, 2017 (gmt 0)

New User from CA 

joined:July 9, 2017
posts:39
votes: 2


not2easy, thanks very much for the links. Good stuff there and I will go back to them again to absorb more. (Brain is a little slow at my age, you know.) Thanks again.
8:13 pm on July 29, 2017 (gmt 0)

Administrator from US 

WebmasterWorld Administrator not2easy is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Dec 27, 2006
posts:3761
votes: 206


all of their requests get a "301" response, which, so far, they have ignored.
This means only that they are requesting the domain at a default URL (such as http://example.com/vulnerability-whatever), which is being redirected to the correct syntax (such as https://www.example.com/vulnerability-whatever), which then causes the 404 response.

The server is probably serving them its default 404 page, the same page you would see if you typed a made-up, non-existent page name into your browser. Unless you want to create a more user-friendly 404 page, that will give them the message. Whenever they next reconfigure their target lists, they may decide not to bother with your domain.
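The redirect layer being described is typically a pair of rules along these lines (a sketch, assuming Apache mod_rewrite, with example.com standing in for the real domain):

RewriteEngine On
# anything arriving over plain HTTP, or without the www host,
# gets a 301 to the canonical https://www form
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule (.*) https://www.example.com/$1 [R=301,L]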
8:25 pm on July 29, 2017 (gmt 0)

New User from CA 

joined:July 9, 2017
posts:39
votes: 2


As I said, these vulnerability scans are common. I see them every day. If you don't have these files, just ignore the requests and let your server return a 403 Forbidden.

keyplyr, thanks, and I am reassured that what I see is not so unusual after all. It is a pity that our servers spend so much time on these things.

By the way, thanks for the link about blocking that you sent earlier. I am already blocking some bad bots and known UAs, and will read the list again to see what else I can do. Your answers and those from the others have been very helpful. This is a learning experience for sure. (Now, if only I could master regular expressions! But I am sure that is another whole topic.)
8:51 pm on July 29, 2017 (gmt 0)

Preferred Member from CA 

Top Contributors Of The Month

joined:Feb 7, 2017
posts: 356
votes: 33


"Mozilla/5.0 Jorgee"

has been around for a year and was commented on in this forum [webmasterworld.com]! It has been noted by many people, attacking from Germany and other locations.

I found a Perl script with that name [exposedbotnets.com], but no instructions, and no owner on GitHub.
1:42 am on July 30, 2017 (gmt 0)

New User from CA 

joined:July 9, 2017
posts:39
votes: 2


This means only that they are requesting the domain at a default URL (such as http://example.com/vulnerability-whatever), which is being redirected to the correct syntax (such as https://www.example.com/vulnerability-whatever), which then causes the 404 response.

Thanks, not2easy. Actually, I have never seen a 404 response to these requests in my logs, always 301. But as you say, the request is to http and is being redirected to https. Would the fact that the requests come so fast prevent the server from sending a follow-up 404 response? I have seen the typical 105 requests come in as little as 12 seconds, almost 9 requests/sec (I have no experience with this).

Also, the requests are labeled HEAD instead of GET. (I haven't figured that out yet.)
1:55 am on July 30, 2017 (gmt 0)

New User from CA 

joined:July 9, 2017
posts:39
votes: 2


"Mozilla/5.0 Jorgee" has been around for a year and commented on this forum [webmasterworld.com]! it has been noted by many people, attacking from Germany, and other locations.

Aha! Thanks, TorontoBoy. The discussion there mentions the DDoS potential, and that might explain the high rate (to me, at least) of requests that I mentioned above. I haven't kept track of where they came from, but I recall Germany, France, USA, Singapore, Australia, etc.
2:29 am on July 30, 2017 (gmt 0)

Moderator from US 

WebmasterWorld Administrator keyplyr is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Sept 26, 2001
posts:11758
votes: 738


It's not the speed preventing a second request after the 301; it's likely that the script is just not written to follow redirects.

It's a very small script (a couple of lines of code) following links from site to site or from social media resources like Twitter.

Newer vulnerability checker scripts will likely follow HTTP to HTTPS redirects since this is common now.
3:26 am on July 30, 2017 (gmt 0)

Administrator from US 

WebmasterWorld Administrator not2easy is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Dec 27, 2006
posts:3761
votes: 206


If the request is being redirected from HTTP to HTTPS the 404 might not be seen in the access logs for HTTP because there may be separate logs for HTTPS requests. I download my logs via ftp and there are two gzip files, one for HTTP and one for HTTPS.
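A quick way to see where those follow-up requests actually land is to search both files at once (a sketch; the log file names here are assumptions, since raw-log naming varies by host):

# count the response codes served to the Jorgee UA across both logs;
# $9 is the status field in the common/combined log format
zgrep -h "Jorgee" example.com.gz example.com-ssl_log.gz | awk '{print $9}' | sort | uniq -c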
3:33 am on July 30, 2017 (gmt 0)

Moderator from US 

WebmasterWorld Administrator keyplyr is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Sept 26, 2001
posts:11758
votes: 738


If you're managing a site that is HTTPS, those are the logs you should be looking at :)
3:50 am on July 30, 2017 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:14780
votes: 635


I have never seen a 404 response to these requests in my logs, always 301. But as you say, the request is to http and is being redirected to https. Would the fact that the requests come so fast prevent the server from sending a follow-up 404 response?

No, because each request is an island, and a 3xx response means just that: "Make a new request". The server doesn't know that the same entity put in a request two milliseconds ago. If something comes in too fast to be handled, you'll see a 500-class response, but it will still be logged. Now, some servers can get a little hiccupy about logging rapid requests: my logs, for example, very often show a bunch of supporting files before the main HTML request. Going by timestamps, things can get up to several seconds out of whack. But in the specific case of http vs https-- as opposed to with/without www-- those will be separate logs, so it's harder to compare them.

Another possibility is that the robot is too primitive to do https at all. Unwanted robots do not necessarily use cutting-edge technology. You don't catch me complaining. (I just checked. HTTP/1.0 can get to https sites. I wasn't sure.)
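You can reproduce the scanner's view of this from a shell (assuming curl is available; example.com stands in for the real domain):

# a HEAD request, as Jorgee sends: gets the 301 and stops there
curl -I http://example.com/mysql/admin/
# -L follows the redirect, so this one goes on to see the eventual 404
curl -IL http://example.com/mysql/admin/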
5:52 pm on July 30, 2017 (gmt 0)

New User from CA 

joined:July 9, 2017
posts:39
votes: 2


It's not the speed preventing a second request after the 301; it's likely that the script is just not written to follow redirects. ... Newer vulnerability checker scripts will likely follow HTTP to HTTPS redirects since this is common now.

Thanks, keyplyr, good point. One more thing I should watch for in the future. ;-)
6:03 pm on July 30, 2017 (gmt 0)

New User from CA 

joined:July 9, 2017
posts:39
votes: 2


not2easy wrote:
If the request is being redirected from HTTP to HTTPS the 404 might not be seen in the access logs for HTTP because there may be separate logs for HTTPS requests. I download my logs via ftp and there are two gzip files, one for HTTP and one for HTTPS.

No, the 404 response isn't in the HTTPS logs either. Curious. I now download all three logs (HTTP, HTTPS and FTP), though I confess I didn't when I started. As everyone points out, there is a lot to learn from the logs!
6:14 pm on July 30, 2017 (gmt 0)

New User from CA 

joined:July 9, 2017
posts:39
votes: 2


If you're managing a site that is HTTPS, those are the logs you should be looking at :)

Hi, keyplyr, and you are right, of course. My original query was motivated by the curious (to me) uniformity of the repeated hits on my site by people using the "Jorgee" script. None of them seemed to try anything different, not on my site at least, and I didn't have enough experience to know whether this was normal and to be expected. As I mentioned above, I am now looking at all three access logs, as long as there is something (for me) to learn from each of them.
6:22 pm on July 30, 2017 (gmt 0)

Moderator from US 

WebmasterWorld Administrator keyplyr is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Sept 26, 2001
posts:11758
votes: 738


That's because it's just a simple script checking for the existence of these files and whether they're accessible. The real hack/injection would come later, likely from a different IP address and UA.

It's good that you're reading your logs, Martin Potter. That's the best way to learn.
6:24 pm on July 30, 2017 (gmt 0)

New User from CA 

joined:July 9, 2017
posts:39
votes: 2


No, because each request is an island, and a 3xx response means just that: "Make a new request". The server doesn't know that the same entity put in a request two milliseconds ago. If something comes in too fast to be handled, you'll see a 500-class response, but it will still be logged. Now, some servers can get a little hiccupy about logging rapid requests: my logs, for example, very often show a bunch of supporting files before the main HTML request. Going by timestamps, things can get up to several seconds out of whack.

Thanks, lucy24, for mentioning all that. I will try to keep it in mind as I go through the logs. I recall having seen a few 500 responses in the past (not "Jorgee" related, I think) but I didn't understand what caused them. There is so much to learn here.
By the way, I have been reading some of the threads in the Apache Server forum. I just wish my aging brain were big enough to hold all of this stuff!
 
