
Forum Moderators: Ocean10000 & incrediBILL & phranque

Server experiencing Ram Spikes

Could this be a server attack?

     
12:09 pm on Jan 8, 2018 (gmt 0)

New User

joined:Jan 8, 2018
posts: 6
votes: 0


First off, I have limited knowledge of Apache, so I came to this forum looking for help.

I work for a small religious non-profit. Our site has been around since 1996 and consists of thousands of HTML pages. It is hosted on a dedicated server at DreamHost. Over the last couple of months we have been experiencing RAM spikes: usage holds steady at 300 to 500 MB and then jumps to almost all 4 GB. (I should mention that CPU usage goes up to 25% at the same time.) This makes our shopping cart (our main source of revenue) unusable. We have Cloudflare set up, and when we turn on DDoS protection, RAM usage drops back to normal. After a while we turn DDoS protection off because it affects how some users use the website. Eventually RAM usage spikes again and we turn DDoS protection back on.

I should mention too that two years ago we switched from an IIS server to an Apache server and encountered a lot of errors due to Linux's case sensitivity. As a result, I converted whole directories of files to lower case and used 301 redirects to map the old URLs to the new ones.
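The redirects are roughly along these lines (the directory name here is just an example; we have one rule per renamed directory):

```apache
# Example only: send any request under an old mixed-case directory to its
# lower-cased replacement, keeping the rest of the path intact.
RedirectMatch 301 ^/Sermons/(.*)$ /sermons/$1
```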

I have two questions: Is our website under attack? Could the large number of 301 redirects be causing the RAM spikes?
12:23 pm on Jan 8, 2018 (gmt 0)

Moderator from US 

WebmasterWorld Administrator keyplyr is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Sept 26, 2001
posts:11135
votes: 662


Hello JohnBlood & welcome to WebmasterWorld [webmasterworld.com]

Likely bots and possibly botnets. You may be on their radar for some reason. You can proactively block these bad actors from accessing your server resources.

I highly recommend manually examining your server access and error logs several times a day. Download them to your local machine and use a text editor to look through them. In time you will learn what to look for.
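If you are comfortable on the command line, a quick first pass is to count requests per IP before you open the file in your editor. The log excerpt below is a made-up stand-in (documentation IPs); run the same commands against the access log you actually download:

```shell
# Stand-in access-log excerpt in Common Log Format. Replace this with
# your real downloaded access.log.
cat > access.log <<'EOF'
192.0.2.10 - - [08/Jan/2018:12:00:01 +0000] "GET /index.html HTTP/1.1" 200 5120
192.0.2.10 - - [08/Jan/2018:12:00:02 +0000] "GET /wp-login.php HTTP/1.1" 404 310
192.0.2.10 - - [08/Jan/2018:12:00:03 +0000] "GET /xmlrpc.php HTTP/1.1" 404 310
198.51.100.7 - - [08/Jan/2018:12:00:05 +0000] "GET /about.html HTTP/1.1" 200 2048
EOF

# Count requests per client IP, busiest first -- the quickest way to spot
# a single address hammering the site.
awk '{print $1}' access.log | sort | uniq -c | sort -rn
```

The addresses at the top of that list are the ones to investigate first.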

Here are a few helpful links:

Search Engine Spider & User Agent ID Forum [webmasterworld.com]

Server Farm IP Ranges [webmasterworld.com]

Blocking Methods [webmasterworld.com]

It takes time to learn what works, and there is no quick fix. The filter Cloudflare uses to block bad bots causes quite a bit of collateral damage (it blocks humans too), so I recommend not using it.

There is a huge amount of valuable information in these forums. We invite you to do some reading and participate further in other discussions.
12:41 pm on Jan 8, 2018 (gmt 0)

New User

joined:Jan 8, 2018
posts: 6
votes: 0


This is a common problem, then? We didn't have any problems for the first year or so.
12:45 pm on Jan 8, 2018 (gmt 0)

Full Member from CA 

Top Contributors Of The Month

joined:Feb 7, 2017
posts: 263
votes: 20


Each site is different and will attract a different set of bots; it is fairly random. A ban list from someone else will contain rules that do not apply to your site, and those extra rules just slow down your server. It is best to build your own custom ban list from your own data.

You will need to learn to read your raw access log, available from cPanel, to find out if any single IPs are really hammering your site. You can then find out who they are, decide whether you want them behaving that way, and if not, ban them using your .htaccess. For example, do you want Russian, Chinese, Indian, Indonesian, or Turkish IPs hitting your site? The raw access log will also show you if someone (or some bot) is trying to break into your site, in which case you can ban them.
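Once you have decided an address or range has no business on your site, the ban itself is only a few lines in .htaccess. A minimal sketch, assuming Apache 2.4, with documentation-range addresses standing in for the real offenders:

```apache
# Block one address and one whole range (Apache 2.4 authorization syntax).
# The addresses are documentation examples; substitute the ones from your
# own raw access log. On Apache 2.2 you would use Order/Deny instead.
<RequireAll>
    Require all granted
    Require not ip 192.0.2.10
    Require not ip 198.51.100.0/24
</RequireAll>
```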
12:47 pm on Jan 8, 2018 (gmt 0)

Moderator from US 

WebmasterWorld Administrator keyplyr is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Sept 26, 2001
posts:11135
votes: 662


It is my opinion that a website today absolutely needs proactive defensive measures, especially if you run a commerce site.

Most of Your Traffic is Not Human [webmasterworld.com]
1:12 pm on Jan 8, 2018 (gmt 0)

Full Member from CA 

Top Contributors Of The Month

joined:Feb 7, 2017
posts: 263
votes: 20


I agree. Every site will get probed; someone will try to hack into it or mercilessly scrape it. You need to be proactive and expect this. They will come, no exceptions.

Hacking and site probing software is very easy to obtain, at no cost, and can be run from any Linux desktop, anywhere in the world. Open source. The barriers to entry are very low.
6:24 pm on Jan 8, 2018 (gmt 0)

New User

joined:Jan 8, 2018
posts: 6
votes: 0


Is there a service like Serci that I can sign up for to protect my server? I'm asking because working on the server is just one of my jobs, and we can't afford to hire anyone right now.
7:37 pm on Jan 8, 2018 (gmt 0)

Moderator from US 

WebmasterWorld Administrator keyplyr is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Sept 26, 2001
posts:11135
votes: 662


As I said, there is no quick fix, and there is a compromise with any *service*. Again, when you use a filter created by a third party, it will not be specific to your needs.

Example: Most bad bots, the ones that hog server resources and give no benefit in return, come from server farms. So blocking server farm IP ranges is a logical defensive action.

However, there are some bots that may be extremely helpful to your specific interests that also come from these IP ranges, so you need to create an exception rule on your server to allow those beneficial agents access while blocking others.
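A sketch of what such an exception rule can look like in .htaccess, assuming mod_rewrite is available. The range and the user-agent string below are placeholders, not recommendations of what to block or allow:

```apache
RewriteEngine On
# Forbid a server-farm range, but let one known-good crawler through.
# 203.0.113.x and "GoodBot" are placeholders -- use the range and the
# user-agent you actually see in your own logs.
RewriteCond %{REMOTE_ADDR} ^203\.0\.113\.
RewriteCond %{HTTP_USER_AGENT} !GoodBot [NC]
RewriteRule .* - [F]
```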

Read the links I provided above. There is a lot to learn.
9:16 pm on Jan 8, 2018 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:14608
votes: 598


You said at the outset that you're on a dedicated server. Once you've done the research--site-specific, so allow for a few months--you may want to look into a firewall. When there is absolutely zero possibility that a given IP range will be welcome on your site, it can be appropriate to block them before they even reach the server.

Denying access (403) in and of itself won't affect server load a whole lot, because the vast majority of malign visitors come in with a shopping list of URLs and will work through that shopping list from top to bottom even if every last request receives a 403. When the site is made up of hand-rolled HTML pages, there's not that much difference between serving the requested page and serving the 403 page. Not much difference to the server, that is. It can make a heck of a difference to you, the human.
5:48 pm on Jan 9, 2018 (gmt 0)

New User

joined:Jan 8, 2018
posts: 6
votes: 0


What is the best way to determine if an IP is malicious? Our hosting company (DreamHost) suggested I use [ip-tracker.org...] but that just tells me the source of the IP address. I assume I can block the address via .htaccess? What firewall would you recommend?
6:17 pm on Jan 9, 2018 (gmt 0)

Full Member from CA 

Top Contributors Of The Month

joined:Feb 7, 2017
posts: 263
votes: 20


Look into your raw access log. If you see an IP that is repeatedly trying to break into your system, repeatedly trying to log in, that is malicious. If you see an IP requesting many pages that are not found, it could be probing for vulnerabilities in software you have not installed. Look at the request methods: are IPs trying to POST to your website? Malicious IPs can come from any country, large and small.
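Those patterns are easy to pull out of the raw log with a couple of one-liners. The excerpt below is a made-up stand-in (documentation IPs); run the same commands against your real access log:

```shell
# Stand-in log excerpt in Common Log Format. Replace this with your
# real access.log.
cat > access.log <<'EOF'
203.0.113.5 - - [09/Jan/2018:06:00:01 +0000] "GET /phpmyadmin/ HTTP/1.1" 404 310
203.0.113.5 - - [09/Jan/2018:06:00:02 +0000] "GET /admin.php HTTP/1.1" 404 310
203.0.113.5 - - [09/Jan/2018:06:00:03 +0000] "POST /wp-login.php HTTP/1.1" 404 310
198.51.100.7 - - [09/Jan/2018:06:00:09 +0000] "GET /index.html HTTP/1.1" 200 5120
EOF

# IPs that rack up 404s are usually probing for software you don't run.
# Field 9 is the HTTP status code in Common Log Format.
awk '$9 == 404 {print $1}' access.log | sort | uniq -c | sort -rn

# POST requests from unknown IPs deserve a close look.
grep '"POST ' access.log
```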

Are you using a CMS such as WordPress? WordPress gets a lot of hacking attempts.
6:24 pm on Jan 9, 2018 (gmt 0)

New User

joined:Jan 8, 2018
posts: 6
votes: 0


We have thousands of hand-coded HTML pages. We do have an install of WordPress to handle our podcast; DreamHost keeps WP up to date. We are a conservative Christian site.
6:46 pm on Jan 9, 2018 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:14608
votes: 598


Any request for any WordPress file is up to no good--excepting, of course, requests from yourself and authorized administrators, generally identifiable by IP. Humans don't request files in /wp; they request normal visible URLs. If you don't happen to have WordPress this is a minor issue: the server looks for the requested file, doesn't find it, returns a 404 and there's an end on't. But if you do have WP, each of those requests leads to massive server activity as the whole WordPress install creaks into life. So this could well be part of the RAM spikes you're noticing.

Again, your first step is to spend some time with raw access logs. Look in particular at any and all requests for WordPress-related files. In addition to blocking the major offenders, you might want to intercept malign requests from unknown sources. For example, with a RewriteRule following this structure:
RewriteCond %{REMOTE_ADDR} !^{list of authorized IP addresses here, beginning with your own}
RewriteCond %{REQUEST_URI} !{list the URIs used by your podcast}
RewriteRule ^wp - [F]
Even the second Condition may not be needed, depending on your URI structure. There's a WordPress forum just next door if it turns out you need specific pointers. The idea here is to block requests before they reach WordPress.

We're not really supposed to talk about specific hosts. But with yours, the default time that raw access logs are kept on the server is three days. Go into your control panel (Domains >> Site Statistics) and change it to a longer period; I use 15 days. That way, you don't lose your logs just because you didn't stop by to download them every other day.
7:49 pm on Jan 9, 2018 (gmt 0)

New User

joined:Jan 8, 2018
posts: 6
votes: 0


So, I've been looking at the web stats from Dreamhost (analog 6.0) and it looks like my install of Piwik (which we don't really use) has received 13,000 requests in the last week. In that same time period, the root directory received 12,000 requests and our OpenCart install over 5,000. Also, we received several thousand requests from an Amazon AWS IP address. Over 134,000 requests were from the Safari browser.

Thoughts?
8:02 pm on Jan 9, 2018 (gmt 0)

Moderator from US 

WebmasterWorld Administrator keyplyr is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Sept 26, 2001
posts:11135
votes: 662


Thoughts?
You wouldn't be asking this question if you had read the links I gave you above.

You have a lot of reading and learning to do.
9:16 pm on Jan 9, 2018 (gmt 0)

New User

joined:Jan 8, 2018
posts: 6
votes: 0


The links that you posted are not helpful in and of themselves. I don't know what to do with pages of IP addresses of server farms and short descriptions of blocking methods. I don't have a lot of time to dedicate to this; I'm asking for advice from professionals.

You recommended against using CloudFlare DDoS protection, but without it our website becomes useless. I'm trying to put fires out as best I can.
9:27 pm on Jan 9, 2018 (gmt 0)

Full Member from CA 

Top Contributors Of The Month

joined:Feb 7, 2017
posts: 263
votes: 20


Web stats will not give you enough detail to determine malicious behavior. For example, AWS is a cesspool of bad bot behavior, but the Wayback Machine also uses it. Find your raw access log and learn how to read it.
9:37 pm on Jan 9, 2018 (gmt 0)

Moderator from US 

WebmasterWorld Administrator keyplyr is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Sept 26, 2001
posts:11135
votes: 662


Don't have a lot of time to dedicate to this
Well then nothing will change. It takes a lot of time to run a successful website, of any type. If you do not protect your property, it will be exploited, hacked, stolen, plagiarized, etc...

Do not rely on Dreamhost's report software (Analog or Piwik) because it is set up generically and does not give you the specifics you need.

As I mentioned above, download your raw server logs (both the access log and the error log) several times per day and use a text editor to manually search around.

You can get some limited info from Analog or Piwik. For example: when you see a particular IP address making a large number of requests, copy that IP address, open your access log, and use your text editor's Search or Find feature to see exactly what that IP was requesting.

Use [centralops.net...] or [ip-tracker.org...] to see who this IP address belongs to. Search at WW to see what others have reported about that company. If you feel this actor is doing things you do not want done at your site (scraping, injecting scripts, searching for vulnerabilities, etc) then block that IP address... better yet block the entire range.

However, as explained in the link I gave you above ([webmasterworld.com...]), the range may also contain beneficial agents that you do not wish to block.

Then you may have questions about how to block a range but allow some User Agents through. This is the forum where you ask those types of questions.

If you really feel that a site traffic report generated by software is best for you, then get your own copy of Analog [mirror.reverse.net] and put it on your local machine. It is highly customizable; tweak it to show exactly what you need.