Forum Moderators: phranque
Redhat ES 3.0 Apache 2.4
Can the maximum number of open files be increased?
Here is part of my /etc/httpd/conf/httpd.conf:
<IfModule prefork.c>
ServerLimit 512
StartServers 5
MinSpareServers 5
MaxSpareServers 10
MaxClients 512
MaxRequestsPerChild 1000
</IfModule>
....
....
<IfModule worker.c>
StartServers 2
MaxClients 150
MinSpareThreads 25
MaxSpareThreads 75
ThreadsPerChild 128
MaxRequestsPerChild 0
------------------ 8< ---------------------
And here is part of my /etc/security/limits.conf file:
* soft nofile 2048
* hard nofile 2048
Any ideas?
Thanks in advance.
If you have static pages, how much traffic? Lots? Does the message only occur during times of high load? You need to do some tuning and raise those limits.
If you're running dynamic content you may be leaking file handles. Running out of descriptors after a consistent number of hits points in this direction, though if you're using mod_php or mod_perl this is less likely.
You can use a tool like lsof ("list open files") to determine who is holding on to files. This may give you clues about why this is happening (for example, a bunch of temporary files).
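A sketch of that kind of check, assuming a Linux-style /proc filesystem and the Red Hat process name httpd (adjust both to your system):

```shell
#!/bin/sh
# Count open descriptors per httpd process: /proc/<pid>/fd holds one
# symlink per open descriptor, so counting entries counts open files.
fd_count() {
    ls /proc/"$1"/fd 2>/dev/null | wc -l
}

# Show the hungriest httpd workers first.
for pid in $(pgrep httpd 2>/dev/null); do
    echo "$pid $(fd_count "$pid")"
done | sort -k2 -rn | head

# lsof view: group the files all httpd processes are holding, so a pile
# of leaked temp files or duplicated log handles floats to the top.
lsof -c httpd 2>/dev/null | awk 'NR>1 {print $NF}' | sort | uniq -c | sort -rn | head
```

If one worker's count keeps climbing between samples while the others stay flat, that worker is the one leaking.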
Allowing more file descriptors will have negative impacts on other areas of the system. You will need to read your OS and Apache's tuning documents to figure out how best to balance these.
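As a rough sanity check against the worker.c settings quoted above, each child needs one descriptor per concurrent connection plus the inherited log handles. The vhost count and logs-per-vhost below are illustrative assumptions, not from the original post:

```shell
#!/bin/sh
# Back-of-the-envelope descriptor demand per worker-MPM child.
THREADS_PER_CHILD=128   # from the worker.c section above
LOGS_PER_VHOST=2        # access log + error log, a typical setup
VHOSTS=20               # hypothetical vhost count -- substitute yours
LISTENERS=1             # one Listen directive

FDS=$((THREADS_PER_CHILD + LOGS_PER_VHOST * VHOSTS + LISTENERS))
echo "approx descriptors per child: $FDS"   # 169 here, well under 2048
```

If a number like this comes out well below your nofile limit and you still run out, that again suggests a leak rather than an undersized limit.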
On Solaris it would be the /etc/system file that needs updating, but I don't know what to set on Red Hat Linux.
/etc/security/limits.conf?
/sbin/sysctl?
ulimit -n 2048
(increased from the default of 1024)
works, but only for a root user's session.
Thanks for any help you can provide.
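To make the higher limit stick beyond a root shell, here's a sketch of the usual Red Hat pieces. File names and values are the common ones, but check your distribution's docs; the key catch is that limits.conf is enforced by pam_limits at login, so a daemon started from an init script at boot may never see it:

```shell
#!/bin/sh
# 1. Kernel-wide ceiling on open files (all processes combined):
#      echo "fs.file-max = 65536" >> /etc/sysctl.conf   # persist
#      /sbin/sysctl -p                                  # apply now
#
# 2. Per-user limit for the account httpd runs as (often "apache"),
#    in /etc/security/limits.conf:
#      apache soft nofile 4096
#      apache hard nofile 4096
#
# 3. Because pam_limits only applies at login, a service launched at
#    boot can keep the old limit. A common workaround is raising it in
#    the init script itself, e.g. near the top of /etc/init.d/httpd:
#      ulimit -n 4096

ulimit -n   # verify in the shell that actually starts httpd
```

After restarting httpd, re-check the running processes rather than your own shell, since the daemon's limit is inherited from whatever started it.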