CentOS – Too many open files issue


Recently I ran into this issue on one of our production machines. The actual error from PHP looked like this:

fopen(dberror_20110308.txt): failed to open stream: Too many open files

I am running a LAMP stack along with memcache on this machine, plus a couple of Java applications. While I did increase the limit on the number of open files from 1024 to 10000, I would really like a way to track the number of files open at any moment as a metric. I know lsof is a command that will list the file descriptors opened by processes. I'm wondering if there is a better (in terms of reporting) way of tracking this using, say, Nagios.
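As a rough interim check, lsof can be used to count descriptors per process. This is a sketch, not an exact measurement: lsof also lists memory-mapped files and duplicated descriptors, so treat the counts as an upper bound.

```shell
# Count open files per PID, highest first (column 2 of lsof output is the PID).
# NR>1 skips the header line.
lsof -n 2>/dev/null | awk 'NR>1 {print $2}' | sort | uniq -c | sort -rn | head -10

# Current soft limit on open files for this shell:
ulimit -n
```

For a specific process, `lsof -p <PID> | wc -l` gives a quick per-process figure with the same caveats.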

Best Answer

You can have a look at /proc/sys/fs/file-nr

cat /proc/sys/fs/file-nr
3391    969     52427
|       |       |
|       |       maximum open file descriptors
|       total free allocated file descriptors
total allocated file descriptors

Total allocated file descriptors is the number of file descriptors allocated since boot; it can be thought of as a high-water mark of the maximum files open at one time. As descriptors are freed they go into the second column, so the number of open files at any given time is column 1 minus column 2.
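That arithmetic scripts directly, which also answers the Nagios question: a minimal Nagios-style check might look like the sketch below. The 80%/90% thresholds are illustrative, not recommendations.

```shell
#!/bin/sh
# Report file handles in use (allocated minus free) against the system
# maximum, using the three columns of /proc/sys/fs/file-nr.
read alloc free max < /proc/sys/fs/file-nr
used=$((alloc - free))
pct=$((used * 100 / max))

# Standard Nagios exit codes: 0 = OK, 1 = WARNING, 2 = CRITICAL.
if [ "$pct" -ge 90 ]; then
    echo "CRITICAL: $used of $max file handles in use (${pct}%)"
    exit 2
elif [ "$pct" -ge 80 ]; then
    echo "WARNING: $used of $max file handles in use (${pct}%)"
    exit 1
fi
echo "OK: $used of $max file handles in use (${pct}%)"
exit 0
```

With the sample values above, this would report 3391 - 969 = 2422 handles in use out of a maximum of 52427.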