Forum Moderators: phranque
The search results list links to a download script:
"/downloadfile.php?id=12345"
and the large files are sent through that download script with a simple fread/echo loop
Bandwidth is not a problem, but my server only has about 90 MB of RAM, so I've limited the number of Apache processes to 10 or 15. Sometimes, though, all of those processes are busy transferring the large files, and the site won't respond to any other requests until they finish. Ouch!
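For reference, capping the process count in the prefork MPM looks something like this (the numbers here are illustrative, not a recommendation):

```
<IfModule mpm_prefork_module>
    StartServers          2
    MinSpareServers       2
    MaxSpareServers       5
    MaxClients           12
    MaxRequestsPerChild 500
</IfModule>
```

With MaxClients at 12, the thirteenth simultaneous request queues on the listen backlog until a process frees up, which is exactly the stall described above when all 12 are tied up with downloads.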
So, I need to:
#1 Reduce the amount of memory consumed by Apache, so that I can keep more processes running.
- Would I gain anything by skipping the fread/echo loop and redirecting the user to a static file instead? I figure PHP may buffer everything it echoes in memory and then let Apache stream it from there... if so, static files would consume less memory.
- Would the worker MPM be better than prefork? I understand the difference, but I'm not sure how it affects performance in this case.
#2 Limit the number of connections per IP, but only for those large files. It won't make any difference to clients as long as they queue up their requests, but it would definitely free up some RAM on my server. How would I do that?
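On #1: whether PHP buffers the whole file depends on output buffering. A rough sketch of a download script that streams in small chunks and flushes each one to Apache immediately, so only a few KB sit in PHP memory at a time (the path and chunk size are illustrative, not the original script):

```php
<?php
// Hypothetical sketch of downloadfile.php; resolve $id to a path first.
$path = '/var/files/' . basename($id);

header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($path));

// Disable output buffering so chunks are not accumulated in memory.
while (ob_get_level() > 0) {
    ob_end_clean();
}

$fp = fopen($path, 'rb');
while (!feof($fp)) {
    echo fread($fp, 8192); // only ~8 KB resident per request
    flush();               // hand the chunk to Apache right away
}
fclose($fp);
```

Note that even with flushing, each download still occupies an Apache process for the full transfer, so this helps memory per process but not the process-exhaustion problem itself.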
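On #2: one way to cap simultaneous connections per client IP for just the download script is the third-party mod_limitipconn module. A sketch, assuming the module is installed (the location and limit are illustrative):

```
# mod_limitipconn needs extended status information
ExtendedStatus On

<Location /downloadfile.php>
    MaxConnPerIP 2
</Location>
```

A client opening a third concurrent download gets a 503 until one of its transfers finishes; ordinary page requests outside that Location are unaffected.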
Thanks for any help,
Martin
Of course, a direct download link would be more efficient.
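One way to get close to static-file efficiency while keeping the PHP access check is the third-party mod_xsendfile module: the script emits only a header and Apache serves the file itself. A sketch, assuming mod_xsendfile is installed and with an illustrative path:

```php
<?php
// Requires mod_xsendfile, plus in the Apache config:
//   XSendFile On
//   XSendFilePath /var/files
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="archive.zip"');
// Apache takes over from here; the PHP interpreter is freed
// immediately instead of sitting in a fread/echo loop.
header('X-Sendfile: /var/files/archive.zip');
```
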
I'd prefer to avoid a ticket system; I'd rather have a prioritized request queue so that HTML/CSS requests always take priority over the 25 MB downloads. Is there a mod for this? I haven't seen one... :\
-Martin