Geometric interpretation of $\det(A^T) = \det(A)$

A geometric interpretation in four intuitive steps:

The Determinant Is the Volume Change Factor

Think of the matrix as a geometric transformation, mapping points (column vectors) to points: $x \mapsto Mx$. The determinant $\det(M)$ gives the factor by which volumes change under this mapping.

For example, in the question you define the determinant as the volume of the parallelepiped whose edges are given by the matrix columns. That parallelepiped is exactly the image of the unit cube (which has volume $1$), so again the determinant is the factor by which volumes change.
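
For instance, take the matrix $M=\begin{pmatrix}2&1\\0&3\end{pmatrix}$ in the plane. It sends the unit square to the parallelogram spanned by its columns $(2,0)$ and $(1,3)$, whose area is $$\det(M)=2\cdot 3-1\cdot 0=6,$$ so every region has its area multiplied by $6$ under $x\mapsto Mx$.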

A Matrix Maps a Sphere to an Ellipsoid

Being a linear transformation, a matrix maps a sphere to an ellipsoid. The singular value decomposition makes this especially clear.

If you consider the principal axes of the ellipsoid (and their preimage in the sphere), the singular value decomposition expresses the matrix as a product of (1) a rotation that aligns the principal axes with the coordinate axes, (2) scalings in the coordinate axis directions to obtain the ellipsoidal shape, and (3) another rotation into the final position.
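
In the usual notation, this reads $$M=U\,\Sigma\,V^\top,$$ with $U,V$ orthogonal and $\Sigma$ diagonal with nonnegative entries (the singular values): $V^\top$ is rotation (1), $\Sigma$ is the scaling (2), and $U$ is rotation (3).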

The Transpose Inverts the Rotation but Keeps the Scaling

The transpose of the matrix is very closely related, since the transpose of a product is the reversed product of the transposes, and the transpose of a rotation is its inverse. In this case, we see that the transpose is given by the inverse of rotation (3), the same scaling (2), and finally the inverse of rotation (1).

(This is almost the same as the inverse of the matrix, except the inverse naturally uses the inverse of the original scaling (2).)
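
Concretely, with the same notation, transposing the decomposition gives $$M^\top=(U\,\Sigma\,V^\top)^\top=V\,\Sigma^\top\,U^\top=V\,\Sigma\,U^\top,$$ while the inverse (when it exists) is $M^{-1}=V\,\Sigma^{-1}\,U^\top$: the same rotations in reverse order, but with the scaling inverted.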

The Transpose Has the Same Determinant

Anyway, the rotations don't change the volume -- only the scaling step (2) changes the volume. Since this step is exactly the same for $M$ and $M^\top$, the determinants are the same.
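
To spell this out, write $\Sigma=\operatorname{diag}(\sigma_1,\ldots,\sigma_n)$ (keeping in mind that $U$ and $V$ need only be orthogonal, i.e. possibly reflections, which does not affect the argument): $$\det(M)=\det(U)\,\det(\Sigma)\,\det(V^\top),\qquad\det(M^\top)=\det(V)\,\det(\Sigma)\,\det(U^\top).$$ For an orthogonal $Q$ we have $\det(Q^\top)=\det(Q^{-1})=1/\det(Q)=\det(Q)$, since $\det(Q)=\pm 1$ (an isometry preserves volume), so both expressions equal $\det(U)\,\det(V)\,\sigma_1\cdots\sigma_n$.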


This is more or less a reformulation of Matt's answer. He relies on the existence of the SVD; I show that $\det(A)=\det(A^T)$ can be obtained in a slightly different way.

Every square matrix can be written as the product of an orthogonal matrix (representing an isometry) and an upper triangular matrix: this is the QR decomposition. The determinant of an upper (or lower) triangular matrix is just the product of its diagonal entries, which stay in place under transposition, so $\det(R^T)=\det(R)$. By the Binet formula (multiplicativity of the determinant), $A=QR$ gives: $$\det(A^T)=\det(R^T Q^T)=\det(R^T)\det(Q^T)=\det(R)\det(Q^{-1}),$$ $$\det(A^T)=\frac{\det(R)}{\det(Q)}=\det(Q)\det(R)=\det(QR)=\det(A),$$ where we used that the transpose of an orthogonal matrix is its inverse, and that the determinant of an orthogonal matrix belongs to $\{-1,1\}$, since an orthogonal matrix represents an isometry.
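
As a small concrete check, take for instance $A=\begin{pmatrix}0&2\\1&1\end{pmatrix}$: then $$A=QR,\qquad Q=\begin{pmatrix}0&1\\1&0\end{pmatrix},\quad R=\begin{pmatrix}1&1\\0&2\end{pmatrix},$$ so $\det(Q)=-1$, $\det(R)=2$, and indeed $\det(A)=\det(A^T)=\det(Q)\det(R)=-2$.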


You can also consider that $(*)$ the determinant of a matrix is preserved under Gauss row moves (replacing a row with the sum of that row and a linear combination of the other rows), and under Gauss column moves too, since the volume spanned by $(v_1,\ldots,v_n)$ is the same as the volume spanned by $(v_1+\alpha_2 v_2+\cdots+\alpha_n v_n,\,v_2,\ldots,v_n)$. By Gauss row moves you can put $A$ in upper triangular form $R$, so that $\det A=\prod_i R_{ii}$. If you apply the same moves as column moves to $A^T$, you end up with $R^T$, which is lower triangular and obviously has the same determinant as $R$.

So, in order to give a "really geometric" proof that $\det(A)=\det(A^T)$, we only need to give a "really geometric" interpretation of $(*)$. An intuition is that the volume of the parallelepiped originally spanned by the columns of $A$ is unchanged if we change, for instance, the basis of our vector space by sending $(e_1,\ldots,e_n)$ to $(e_1,\ldots,e_{i-1},e_i+\alpha\, e_j,e_{i+1},\ldots,e_n)$ with $i\neq j$: the geometric object is the same, and we are only changing its "description".
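
For instance, in two dimensions the row move that adds $\lambda$ times the first row to the second is a shear, and $$\det\begin{pmatrix}a&b\\c+\lambda a&d+\lambda b\end{pmatrix}=a(d+\lambda b)-b(c+\lambda a)=ad-bc=\det\begin{pmatrix}a&b\\c&d\end{pmatrix},$$ matching the geometric picture: the shear slides one edge of the parallelogram parallel to the other, so base and height (hence area) are unchanged.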


Since $\text{sign}(\sigma^{-1})=\text{sign}(\sigma)$ and $\phi:S_n\to S_n,\ \sigma\mapsto\sigma^{-1}$ is a bijection, we have $$\det(A)=\sum_{\sigma\in S_n}\text{sign}(\sigma)\prod_{i=1}^n a_{i\sigma(i)}=\sum_{\sigma\in S_n}\text{sign}(\sigma^{-1})\prod_{i=1}^n a_{\sigma^{-1}(i)i}=\sum_{\sigma\in S_n}\text{sign}(\sigma)\prod_{i=1}^n a_{\sigma(i)i}=\det(A^T),$$ where the second equality reindexes each product via $j=\sigma(i)$, and the third replaces the summation variable $\sigma$ by $\sigma^{-1}$ using the bijection $\phi$.
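
For $n=2$, for example, $S_2=\{\mathrm{id},(1\,2)\}$ and the two sums read $$\det(A)=a_{11}a_{22}-a_{12}a_{21},\qquad \det(A^T)=a_{11}a_{22}-a_{21}a_{12},$$ which agree because the products differ only in the order of their factors.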

Note: I'm not using the geometric definition, but I could only post this here, since the question without the geometric requirement was incorrectly flagged as a duplicate of this one.