|The larger the file, the slower the serving speed|
| 4:03 pm on Apr 27, 2010 (gmt 0)|
I first started this thread on the Apache forum, but since my problem exists not only with Apache but also with FTP downloads, I think this may be an operating system/configuration issue. Any help will be appreciated. Here is the problem:
I noticed that the bigger a file is, the slower Apache serves it on my CentOS machine. The same happens with FTP downloads.
200 MB file: speed 30KB/s
100 MB file: speed 50KB/s
30 MB file: speed 150KB/s
1 MB file: speed 450+ KB/s
I am on a dedicated server that belongs to me, so I can see the server stats; there are no problems on the server end.
- I am on a 100 Mbit port, with no restrictions and no cap on bandwidth.
- Logging into the server via SSH and running "top" shows a maximum load of 0.05. There is no load on the server.
- I even tried an OCZ Solid State Hard Drive. No changes.
- I have an 8 Mbit internet connection at home. When I FTP in and try to download the same video, the speed is the same, around 50 KB/s, and it never goes up.
- If I download using 8-10 threads (my FTP software has a multi-thread downloading feature that can download up to 20 files at the same time), then my download capacity is maxed out, at 700-800 KB/s or so.
I have CentOS 5.x on the server. Is the problem caused by Apache, or by CentOS? Maybe it is a precaution to keep a single user from saturating the server's bandwidth. There must be some configuration for this, but where? Any help will be appreciated.
| 5:59 am on Apr 28, 2010 (gmt 0)|
Assuming you have SSH access to this dedicated server, you could first use the Apache benchmark tool ab to see whether the problem is in the network or on your server. A command like the following (with the URL replaced by one of your own large files):
ab -n 1 http://www.example.com/largefile.bin
gives statistics about the load speed of a large file.
If you get high speeds with this command, the problem is somewhere in the network. If you also get low speeds with the ab command, then the throttling is happening on the server. The ab command has several options to set the number of concurrent requests, etc.
Another relatively unknown tool on Linux is tc, the traffic controller. It can set rate limiting and packet filtering options in the kernel. I have used it once to throttle download speeds on a server that was connected to a slow network, to prevent it from saturating that network connection. You can run the following commands in an SSH shell:
tc qdisc show
tc class show
tc filter show
Only the first command should return a single line with the default queueing setting. If you get more output from these commands, then some traffic shaping has been enabled in the kernel.
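For reference, rate limiting of the kind described above is typically set up with a token bucket filter. A minimal sketch, assuming the public interface is eth0 (the device name and the 1 Mbit figure are placeholders, not values from this thread):

```shell
# Attach a token bucket filter to eth0, limiting egress to roughly 1 Mbit/s.
# Requires root. Remove it again with: tc qdisc del dev eth0 root
tc qdisc add dev eth0 root tbf rate 1mbit burst 32kbit latency 400ms
```

If something like this has been configured, `tc qdisc show` will list the tbf qdisc instead of the single default line.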
| 8:31 am on Apr 28, 2010 (gmt 0)|
Hello lammert, thanks for the answer.
I checked a couple of files with 'ab' as you suggested, and the load time is between 0.4 seconds and 4 seconds: 0.4 s for a file of around 13 MB, and 4 s for a file of around 220 MB.
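Converted to throughput, those ab timings work out as follows (a quick sketch, using the file sizes and times as reported above):

```shell
# Throughput implied by the ab timings: size / time, then *8 for megabits.
awk 'BEGIN {
  printf "13 MB in 0.4 s = %.1f MB/s (%.0f Mbit/s)\n", 13/0.4, 13/0.4*8
  printf "220 MB in 4 s  = %.1f MB/s (%.0f Mbit/s)\n", 220/4,  220/4*8
}'
```

Both figures are orders of magnitude above the 25-70 KB/s that actual downloaders see, which points away from Apache itself.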
I got responses from 3-4 people saying their download speed is low, between 25 and 70 KB/s. One guy in Poland said he got 800 KB/s on a 7 Mbit internet connection, but a girl had only 25 KB/s.
I noticed something very weird half an hour ago. I ran some tests on the same video file this time. I tried to download it 6-7 times with right click - "Save target as...". The download speed was roughly as follows:
400 KB/s .......
This is really weird: while the speed is generally slow, it sometimes downloads at 400 KB/s. I checked, and this is not related to resuming a previous download; I see 400 or 400+ KB/s even when the download begins from scratch.
tc qdisc show
tc class show
tc filter show
Only the first one shows a line of data; the other two return nothing.
| 11:00 am on Apr 28, 2010 (gmt 0)|
The ab tool tests the speed at which Apache is able to send out data. 220 MB in 4 seconds is 440 megabit/sec; you could almost saturate a 1 Gbit Ethernet network with that :) This rules out internal problems in the Apache configuration. The empty traffic control tables are an indication that the problem is not in the kernel either. And since people in other locations face the same problem, it is also certain that your personal Internet connection is not the cause.
That makes it probably a problem outside your control, maybe an overload situation in the data center or some deliberate rate limiting there. I once had a dedicated server in a not-too-expensive data center where they started throttling traffic once a server generated more than a few GB per day. The throttling kicked in almost randomly, stayed on for a few hours, and then was lifted again. I contacted the help desk and they said it was an automatic setting in one of their routers. They were able to remove the limitation on a per-client basis, though. You could ask your hosting provider whether some rate limiting is going on in their setup.
| 5:38 pm on Apr 28, 2010 (gmt 0)|
- 220MB in 4 seconds is 440 megabit/sec
Yes, I hadn't looked at it that way, thank you. It works out to 55 MB/s x 8 = exactly 440 megabit/sec.
Well, thanks for the heads up about the datacenter. I will contact them about the issue.