Why does git fail on push/fetch with "Too many open files"
There are two similar error messages:

    EMFILE: Too many open files
    ENFILE: Too many open files in system
It looks like you're getting EMFILE, which means that the number of open files for a single process is being exceeded. So checking whether vi can open files is irrelevant: vi uses its own, separate file table. Check your limits with:
    $ ulimit -n
    1024
So on my system, there is a limit of 1024 open files in a single process. You shouldn't need to ask your system administrator (please don't use the acronym SA, it's too opaque; if you must abbreviate, use "sysadmin") to raise the limit.
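Bash can also report the soft and hard limits separately, which tells you how far the soft limit can be raised without root (a small sketch using bash's -S/-H options):

    # soft limit: the value a process started from this shell inherits
    ulimit -Sn
    # hard limit: the ceiling the soft limit can be raised to without root
    ulimit -Hn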
You may wish to check which files Git opens by running it under strace. This could be a bug in Git or in a library, or it could be that you're using an old version of something, or it could be something more bizarre. Try strace first to see which files Git opens, and check whether it closes them again.
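For example, a minimal strace invocation on Linux (the triggering command, the syscall list, and the log file name are just illustrative; modern libc mostly uses openat rather than open):

    # -f follows git's child processes; log only file open/close syscalls
    strace -f -e trace=open,openat,close -o git-trace.log git fetch
    # inspect the log to see which files were opened and whether they were closed
    grep -E 'open|close' git-trace.log | less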
Update from Hazok:
After following the recommendations above, it turned out the error was caused by too many loose objects. There were too many loose objects because git gc wasn't being run often enough.
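You can check how many loose objects a repository currently has with git count-objects; the "count" field is the number of loose object files:

    # run inside the repository
    git count-objects -v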
Why did this happen?
From the git documentation for gc.auto:

"When there are approximately more than this many loose objects in the repository, git gc --auto will pack them. Some Porcelain commands use this command to perform a light-weight garbage collection from time to time. The default value is 6700."
Here "Some Porcelain commands" includes git push
, git fetch
etc. So if the max open files limit ulimit -n
< 6700, you'll be eventually blocked by git gc --auto
once you got ~6700 loose objects in a single git repo.
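A quick way to compare the two numbers on your own machine (git config prints nothing for gc.auto if you are still on the default of 6700):

    # the configured gc --auto threshold, if one has been set
    git config --get gc.auto
    # the per-process open file limit of the current shell
    ulimit -n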
I'm in a hurry. How do I fix it?
If you have sufficient permissions, raise the limit in the shell that will run git (ulimit is a shell builtin, so prefixing it with sudo won't work, and going above the hard limit requires root):

    $ ulimit -n 8192
Otherwise, you can disable automatic gc by setting git config gc.auto 0, so that you can push your local commits to the remote, delete the repo, and clone it back without thousands of loose objects.
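A sketch of that workaround; myrepo and the remote URL are placeholders for your own repository:

    cd myrepo
    git config gc.auto 0       # disable automatic gc for this repository only
    git push origin HEAD       # the push no longer triggers git gc --auto
    cd ..
    rm -rf myrepo
    git clone git@example.com:me/myrepo.git    # a fresh clone comes back fully packed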
How can we prevent this from happening again?
Set git config --global gc.auto 200, where 200 is some value less than your max open files limit. If you pick a value that is too small, git gc will run too frequently, so choose wisely.
If you set gc.auto=0, loose objects will never be packed unless you run git gc manually, so hundreds of thousands of files can accumulate in the same directory. That can be a problem, especially on mechanical hard drives or on Windows. (See also: How many files in a directory is too many? and Is it OK (performance-wise) to have hundreds or thousands of files in the same Linux directory?)
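If you go the gc.auto=0 route, an occasional manual cleanup keeps the object directory small (a sketch):

    git count-objects -v    # note the "count" field before packing
    git gc                  # packs loose objects into a packfile and prunes stale ones
    git count-objects -v    # "count" should now be at or near zero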