git says "fatal: confused by unstable object source data"
There is another possibility, if a remote one: a really big file (e.g. 3 GB or more). Put simply, Git may be unable to handle it. We ran into this error while trying to create a repository in a directory tree containing huge files.
Either
- one or more files are being modified during your operation, or
- something is causing inconsistent reads (e.g. failing hardware).
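A quick way to check the first possibility is to hash the suspect file twice in a row and compare. The helper below is a hypothetical diagnostic sketch (not part of Git); it streams the file through SHA-1 the same way twice:

```python
import hashlib

def sha1_of_file(path, chunk_size=65536):
    """Stream a file through SHA-1 in chunks."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def is_stable(path):
    """Return True if two consecutive reads of the file hash identically."""
    return sha1_of_file(path) == sha1_of_file(path)
```

If `is_stable` returns False while nothing should be writing to the file, suspect the hardware or filesystem rather than Git.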
Short version: Git’s developers did not intend for it to be used on volatile files.
Due to the layout* that Git uses for “loose objects” and the limited filesystem semantics that it assumes**, Git must know the first byte (two hex characters) of the object name (SHA-1) of a new object before it can start storing that object.
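To illustrate why the name must be known up front: a loose object's path is derived from the SHA-1 of its contents prefixed with a small header ("blob <size>\0" for file contents), so the directory name (the first two hex digits) is unknown until the whole object has been hashed. A minimal sketch of the computation:

```python
import hashlib

def loose_object_path(content: bytes) -> str:
    """Compute where Git would store this content as a loose blob object."""
    header = f"blob {len(content)}\x00".encode()
    sha = hashlib.sha1(header + content).hexdigest()
    # The first two hex characters name the fan-out directory.
    return f".git/objects/{sha[:2]}/{sha[2:]}"

print(loose_object_path(b"hello\n"))
# .git/objects/ce/013625030ba8dba906f756967f9e9ca394464a
```

You can verify the hash against `echo hello | git hash-object --stdin`.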
* The `objects/[0-9a-f][0-9a-f]/` directories. See gitrepository-layout.
** Specifically, it needs to be able to do “atomic” file renames. Certain filesystems (usually network filesystems; specifically AFS, I believe) only guarantee rename atomicity when the source and the destination of a rename are inside the same directory.
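The write-to-temp-then-rename pattern that relies on this atomicity looks roughly like the following sketch (helper name and structure are mine, not Git's actual code):

```python
import os
import tempfile

def atomic_write(directory, final_name, data):
    """Write data to a temp file in the SAME directory, then rename it.

    Keeping the temp file beside its destination is what lets the rename
    be atomic even on filesystems (e.g. AFS) that only guarantee
    atomicity within a single directory.
    """
    fd, tmp_path = tempfile.mkstemp(dir=directory)
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(data)
        os.rename(tmp_path, os.path.join(directory, final_name))
    except BaseException:
        os.unlink(tmp_path)
        raise
```

Readers either see the old state or the complete new file, never a half-written one.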
Currently, Git makes two SHA-1 passes over each new file. The first pass checks whether Git already knows about the contents of the file (whether its SHA-1 object name already exists in the object store). If the object already exists, the second pass is skipped.
For new contents (object was not already in the object store), the file is read a second time while compressing and computing the SHA-1 of the data being compressed. The compressed data is written to a temporary file that is only renamed to its final loose object name if the initial SHA-1 (“already stored?” check) matches the later SHA-1 (hash of the data that was compressed and written). If these SHA-1 hashes do not match, then Git shows the error message you are seeing and aborts. This error checking was added in 748af44c63, which was first released in Git 1.7.0.2.
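The two-pass scheme can be sketched like this. It is a simplified model with a plain dict standing in for the object store; real Git streams the data and handles far more detail:

```python
import hashlib
import zlib

def git_blob_sha1(data: bytes) -> str:
    """SHA-1 of the blob header plus content, as Git names objects."""
    return hashlib.sha1(b"blob %d\x00" % len(data) + data).hexdigest()

def store_blob(path: str, object_store: dict) -> str:
    # Pass 1: hash the file to see whether the object already exists.
    with open(path, "rb") as f:
        first_sha = git_blob_sha1(f.read())
    if first_sha in object_store:
        return first_sha  # already stored; no second pass needed

    # Pass 2: re-read the file, compress, and hash again.
    with open(path, "rb") as f:
        data = f.read()
    second_sha = git_blob_sha1(data)
    if second_sha != first_sha:
        # The file changed between the two reads.
        raise RuntimeError("fatal: confused by unstable object source data")
    object_store[second_sha] = zlib.compress(b"blob %d\x00" % len(data) + data)
    return second_sha
```

If the file changes between the two reads, the hashes disagree and the sketch raises the same error the question is about.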