While working with a file-processing server, I had a webpage execute a pdfunite command to join 1,100 PDF files into a single large PDF (the page was invoked via curl from a cron job). The script failed, and I found “Too many open files” in its output. It turned out that running pdfunite with that many files exceeded the per-process open file limit on Linux: pdfunite opens every input file, plus the output, so the job needed well over a thousand file descriptors at once.
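Before raising anything, it is worth confirming what the limits actually are; on many distributions the default soft limit is 1024 descriptors, which 1,100 inputs easily exceed. A quick check as the apache user (the account the web server runs under here):

# su - apache -s '/bin/bash' -c 'ulimit -Sn'    # current soft limit
# su - apache -s '/bin/bash' -c 'ulimit -Hn'    # current hard limit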
I increased the per-user limits by editing /etc/security/limits.conf and adding the following lines at the end of the file:
* hard nofile 500000
* soft nofile 500000
root hard nofile 500000
root soft nofile 500000
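One caveat worth knowing about limits.conf: it is applied by the pam_limits module, so it only affects sessions that go through PAM (logins, and cron jobs on most setups). A daemon launched directly by systemd instead keeps whatever LimitNOFILE its unit specifies. If the web server had still hit the limit after a reboot, a drop-in like this sketch (assuming the service is named httpd) would raise it at the service level:

# mkdir -p /etc/systemd/system/httpd.service.d
# cat > /etc/systemd/system/httpd.service.d/nofile.conf <<'EOF'
[Service]
LimitNOFILE=500000
EOF
# systemctl daemon-reload
# systemctl restart httpd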
I rebooted, because I could (logging out and back in would also have picked up the change, since the limits are applied at login), and then verified the new maximum open file count with ulimit:
# su - apache -s '/bin/bash' -c 'ulimit -aHS' | grep 'open files'
open files (-n) 500000
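The su check shows what a fresh session gets; to confirm that an already-running process picked up the new limit, /proc exposes each process's effective limits. A quick check, assuming the web server process is named httpd:

# grep 'Max open files' /proc/$(pgrep -o httpd)/limits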
I submitted the files for processing again, and this time the merge completed without a problem.
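Raising the limit was the right fix here, but for much larger batches you can also keep pdfunite under any limit by merging in chunks. A minimal sketch, assuming the inputs live in pdfs/ and that the glob order is the order you want:

#!/bin/bash
# Merge PDFs in chunks of 500, then merge the intermediates,
# so pdfunite never holds more than ~500 files open at once.
set -e
chunk=500
files=(pdfs/*.pdf)
parts=()
i=0
for ((start=0; start<${#files[@]}; start+=chunk)); do
    part="part_$((i++)).pdf"
    pdfunite "${files[@]:start:chunk}" "$part"
    parts+=("$part")
done
pdfunite "${parts[@]}" combined.pdf
rm -f "${parts[@]}"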