Proofs of Determinants of Block Matrices

Proof of the 3rd identity.

It is a consequence of the following "block diagonalization" identity, valid when $A$ is invertible:

$$\pmatrix{ A&B\\ C&D }=\pmatrix{ I&0\\ CA^{-1}&I }\pmatrix{ A&0\\ 0&S }\pmatrix{ I&A^{-1}B\\ 0&I } \ \ \text{with} \ \ S:=D-CA^{-1}B$$

($S$ is the Schur complement of $A$; see https://en.wikipedia.org/wiki/Schur_complement).

Taking determinants on both sides then gives $\det\pmatrix{A&B\\C&D}=\det(A)\det(S)=\det(A)\det(D-CA^{-1}B)$, since the two triangular factors have unit determinant.
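As a sanity check, the factorization and the resulting determinant identity can be verified numerically. This is a minimal numpy sketch; the blocks below are arbitrary example values, not taken from the text:

```python
import numpy as np

# Arbitrary example blocks with A invertible (not from the text)
A = np.array([[2.0, 1.0], [1.0, 1.0]])
B = np.array([[0.0, 3.0], [1.0, 2.0]])
C = np.array([[1.0, 0.0], [2.0, 1.0]])
D = np.array([[4.0, 1.0], [0.0, 3.0]])

M = np.block([[A, B], [C, D]])
S = D - C @ np.linalg.inv(A) @ B  # Schur complement of A

# Check the three-factor "block diagonalization" of M
I2, Z = np.eye(2), np.zeros((2, 2))
L = np.block([[I2, Z], [C @ np.linalg.inv(A), I2]])
Mid = np.block([[A, Z], [Z, S]])
U = np.block([[I2, np.linalg.inv(A) @ B], [Z, I2]])
assert np.allclose(L @ Mid @ U, M)

# Taking determinants: det(M) = det(A) * det(S)
assert abs(np.linalg.det(M) - np.linalg.det(A) * np.linalg.det(S)) < 1e-9
```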

Remark : For many matrix formulas, take a look at the amazing compendium : "Matrix Mathematics: Theory, Facts, and Formulas" Second Edition by Dennis S. Bernstein (Princeton University Press, 2009).


To prove $(1)$ (assuming $A$ is invertible), it suffices to note that $$ \pmatrix{A &B\\0&D} = \pmatrix{A & 0\\0 & D} \pmatrix{I&A^{-1}B\\0 & I} $$ The second factor is upper-triangular with $1$s on the diagonal, so its determinant is $1$. The determinant of the block-diagonal factor is $\det(A)\det(D)$, as one sees from the Leibniz expansion.
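A quick numerical check of this factorization and of the resulting identity $\det = \det(A)\det(D)$ (a numpy sketch with arbitrary example blocks; $A$ is chosen invertible):

```python
import numpy as np

# Arbitrary example blocks; A must be invertible for the factorization
A = np.array([[3.0, 1.0], [2.0, 1.0]])
B = np.array([[1.0, 4.0], [0.0, 2.0]])
D = np.array([[1.0, 2.0], [3.0, 5.0]])
Z = np.zeros((2, 2))
I2 = np.eye(2)

M = np.block([[A, B], [Z, D]])

# (A B; 0 D) = (A 0; 0 D) * (I A^{-1}B; 0 I)
diag = np.block([[A, Z], [Z, D]])
tri = np.block([[I2, np.linalg.inv(A) @ B], [Z, I2]])
assert np.allclose(diag @ tri, M)

# The triangular factor has determinant 1, so det(M) = det(A) * det(D)
assert abs(np.linalg.det(M) - np.linalg.det(A) * np.linalg.det(D)) < 1e-9
```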

For an example where $(2)$ fails to hold, consider the matrix $$ \pmatrix{ 0&1&0&0\\ 0&0&1&0\\ 0&0&0&1\\ 1&0&0&0 } = \pmatrix{B&B^T\\B^T&B} $$ For an example where the diagonal blocks are invertible, add $I$ to the whole matrix.
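The statement of $(2)$ is not reproduced above; assuming it is the familiar formula $\det M=\det(AD-CB)$ (an assumption on my part), a short numpy sketch confirms that it fails on this matrix:

```python
import numpy as np

B = np.array([[0.0, 1.0], [0.0, 0.0]])
M = np.block([[B, B.T], [B.T, B]])  # the 4x4 cyclic permutation matrix above

# det(M) = -1 (a 4-cycle is an odd permutation), but with A = D = B and
# C = B^T the candidate formula det(AD - CB) gives det(0) = 0.
assert round(np.linalg.det(M)) == -1
assert round(np.linalg.det(B @ B - B.T @ B.T)) == 0

# Adding I makes the diagonal blocks invertible, and the formula still fails:
# det(M + I) = 0, while det((B+I)(B+I) - B^T B^T) = det(I + 2B) = 1.
I2, I4 = np.eye(2), np.eye(4)
assert round(np.linalg.det(M + I4)) == 0
assert round(np.linalg.det((B + I2) @ (B + I2) - B.T @ B.T)) == 1
```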


Suppose block $A$ has dimension $r$ and block $D$ has dimension $s$. Write the matrix entrywise as $(c_{i,j})_{1\le i,j\le r+s}$ and use the Leibniz definition of the determinant: $$\begin{vmatrix} A&C\\0& D\end{vmatrix} =\sum_{\sigma\in\mathfrak S_{r+s}}\operatorname{sgn}(\sigma)\prod_{1\le j\le r+s}c_{\sigma(j),j}.$$ Now the non-zero terms are those such that, if $1\le j\le r$, then $1\le \sigma(j)\le r$ (the entries below block $A$ vanish); consequently, if $r+1\le j\le r+s$, then $r+1\le \sigma(j)\le r+s$. So the non-zero terms are those for which the permutation $\sigma\in \mathfrak S_{r+s}$ is the concatenation of a permutation in $\mathfrak S_r$ and a permutation in $\mathfrak S_s$, and clearly the signature of $\sigma$ is the product of the signatures of these two factors.

The first formula for the determinant, $\det\begin{pmatrix}A&C\\0&D\end{pmatrix}=\det(A)\det(D)$, then follows by distributivity: the sum factors as the product of the two smaller Leibniz sums.
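The permutation argument can be mirrored in code. The following pure-Python sketch of the Leibniz formula (helper names are my own) confirms the resulting identity $\det\begin{pmatrix}A&C\\0&D\end{pmatrix}=\det(A)\det(D)$ on a small example:

```python
from itertools import permutations

def sign(p):
    """Signature of a permutation, given as a tuple: (-1)^(number of inversions)."""
    s = 1
    for i in range(len(p)):
        for j in range(i + 1, len(p)):
            if p[i] > p[j]:
                s = -s
    return s

def det(m):
    """Leibniz formula: sum over sigma of sgn(sigma) * prod_j m[sigma(j)][j]."""
    n = len(m)
    total = 0
    for p in permutations(range(n)):
        term = sign(p)
        for j in range(n):
            term *= m[p[j]][j]
        total += term
    return total

# Block upper-triangular example: r = s = 2, lower-left block is zero
A = [[1, 2], [3, 4]]  # det(A) = -2
D = [[2, 1], [1, 1]]  # det(D) = 1

M = [[1, 2, 5, 6],
     [3, 4, 7, 8],   # the off-diagonal block C is irrelevant
     [0, 0, 2, 1],
     [0, 0, 1, 1]]

print(det(M) == det(A) * det(D))  # True
```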