We have a file download site similar to download.com. We currently store all the files outside of our webroot on the file system and use PHP to read each file and stream it to the user.
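For reference, the serving script is essentially the following (a simplified sketch, not the exact code; the paths and the 'id' parameter are placeholders):

<?php
// Simplified sketch of the download script described above.
// '/var/files' and the 'id' parameter are placeholders, not the real names.
$file = '/var/files/' . basename($_GET['id']); // storage is outside the webroot

if (!is_file($file)) {
    header('HTTP/1.1 404 Not Found');
    exit;
}

header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($file));
header('Content-Disposition: attachment; filename="' . basename($file) . '"');

readfile($file); // PHP streams the file, tying up an Apache child for the whole download
?>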
Recently we opened a new section where the file sizes are between 10-20MB (up from roughly 100KB). This has nearly killed our server: each connection lasts almost 100x longer, and the number of concurrent Apache processes needed to handle them is way up.
I'm wondering what the most efficient way is to serve files like this. Is there a way to load them into some server memory cache, perhaps? Maybe there's an Apache or PHP mod that can do this more efficiently? Anyone have any recommendations?
Thanks,
Will
Or you could offload those particular files to another server and serve them from there, leaving your primary server free to do the other work.
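A minimal sketch of that hand-off, assuming the large files have been copied to a second box (files.example.com and the 'id' parameter are placeholders):

<?php
// Hypothetical hand-off: redirect the download to a dedicated file server
// so the primary server's Apache child is freed immediately.
// Note this exposes a direct URL on the second box, so any access checks
// would have to run here before the redirect is sent.
$file = basename($_GET['id']);

header('Location: http://files.example.com/downloads/' . rawurlencode($file));
exit;
?>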