Eigenvalues for a product of matrices

Hint: For invertible $B$, the characteristic polynomial of $AB$ is $$\det(\lambda I - AB) = \det[(\lambda B^{-1} - A) B] = \det (\lambda B^{-1} - A)\det B.$$ Rewrite this to show that $$\det(\lambda I - AB) = \det(\lambda I - BA).$$

Additional hint: Now, both sides are evidently continuous functions of $B$, and the set of invertible $n \times n$ matrices is dense in the space of all $n \times n$ matrices.
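As a concrete sanity check, here is a small sympy computation of both characteristic polynomials for one arbitrary choice of $A$ and invertible $B$ (the matrices are illustrative, nothing about them is canonical):

```python
# Symbolic check of det(lambda*I - AB) = det(lambda*I - BA)
# for an arbitrary A and an invertible B.
import sympy as sp

lam = sp.symbols('lambda')
A = sp.Matrix([[1, 2], [3, 4]])
B = sp.Matrix([[0, 1], [1, 1]])   # det B = -1, so B is invertible

I = sp.eye(2)
p_AB = sp.expand((lam * I - A * B).det())
p_BA = sp.expand((lam * I - B * A).det())
print(p_AB)          # lambda**2 - 9*lambda + 2
print(p_AB == p_BA)  # True
```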

Applying the above argument with $A = A_1 \cdots A_{s-1}$ and $B = A_s \cdots A_r$, we can also conclude that the eigenvalues of $A_1 \cdots A_r$ coincide with those of any product $A_s \cdots A_r A_1 \cdots A_{s-1}$ obtained by permuting the factors cyclically. On the other hand, this fails for arbitrary permutations: in general the eigenvalues of $A_{\sigma(1)} \cdots A_{\sigma(r)}$ are not those of $A_1 \cdots A_r$. For a minimal example, take $$A_1 = \pmatrix{1&0\\0&0}, \qquad A_2 = \pmatrix{0&1\\1&0}, \qquad A_3 = \pmatrix{0&1\\0&0} .$$ Then $A_1 A_2 A_3 = 0$ but $A_1 A_3 A_2 = \pmatrix{1&0\\0&0}$, as the numerical check below confirms.
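Here is that example checked numerically (a numpy sketch; the comments show the expected output up to floating-point noise):

```python
# Cyclic permutations of a product share a spectrum;
# non-cyclic reorderings need not.
import numpy as np

A1 = np.array([[1, 0], [0, 0]])
A2 = np.array([[0, 1], [1, 0]])
A3 = np.array([[0, 1], [0, 0]])

print(np.linalg.eigvals(A1 @ A2 @ A3))  # eigenvalues 0, 0
print(np.linalg.eigvals(A2 @ A3 @ A1))  # eigenvalues 0, 0  (cyclic shift)
print(np.linalg.eigvals(A3 @ A1 @ A2))  # eigenvalues 0, 0  (cyclic shift)
print(np.linalg.eigvals(A1 @ A3 @ A2))  # eigenvalues 1, 0  (non-cyclic)
```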

You can also show, using these ingredients, that the trace of an $r$-fold product of square matrices is invariant under cyclic permutations, but the above example again shows that the same is not true for arbitrary permutations.
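The same example, checked for traces (again a numpy sketch):

```python
# Traces agree under cyclic shifts of the factors, but not in general.
import numpy as np

A1 = np.array([[1, 0], [0, 0]])
A2 = np.array([[0, 1], [1, 0]])
A3 = np.array([[0, 1], [0, 0]])

print(np.trace(A1 @ A2 @ A3))  # 0
print(np.trace(A3 @ A1 @ A2))  # 0  (cyclic shift)
print(np.trace(A1 @ A3 @ A2))  # 1  (non-cyclic reordering)
```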


In general $AB$ and $BA$ are similar matrices: if $B$ is invertible then $AB = B^{-1}(BA)B$, and likewise if $A$ is invertible then $BA = A^{-1}(AB)A$. Similar matrices represent the same linear transformation with respect to different bases, hence they have the same characteristic polynomial. Here "in general" means "provided $A$ and $B$ are not both singular".
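For what it's worth, the similarity identity is easy to check symbolically (a sympy sketch with one arbitrary invertible $B$):

```python
# Verifying AB = B^{-1}(BA)B for an arbitrary A and an invertible B.
import sympy as sp

A = sp.Matrix([[1, 2], [3, 4]])
B = sp.Matrix([[0, 1], [1, 1]])   # det B = -1, so B is invertible

print(sp.simplify(B.inv() * (B * A) * B - A * B))  # the zero matrix
```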

The same sort of argument shows that in a group $ab$ and $ba$ have the same order: $ba = a^{-1}(ab)a$, so the two elements are conjugate.
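A tiny illustration with permutations (a sketch using sympy's combinatorics module; the two permutations are arbitrary choices):

```python
# Conjugate elements have the same order, so ab and ba do.
from sympy.combinatorics import Permutation

a = Permutation([1, 0, 2, 3])   # the transposition (0 1)
b = Permutation([0, 2, 3, 1])   # the 3-cycle (1 2 3)

print((a * b).order(), (b * a).order())  # 4 4
```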

So now you have an explanation "in general".

How can one "understand" the result in the special cases? If the matrices have entries from $\mathbb{C}$ we can use a continuity argument -- this is often a useful trick. Just note that the non-singular matrices are dense in the matrix ring, and then choose non-singular $A_n, B_n$ converging to $A, B$. Since the coefficients of the characteristic polynomial are polynomial functions of the matrix entries, they are continuous, so as $n\to\infty$ we get $c_{A_nB_n}\to c_{AB}$ and $c_{A_nB_n}=c_{B_nA_n}\to c_{BA}$; hence $c_{AB}=c_{BA}$.
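The argument can be watched in action numerically (a numpy sketch; the singular $A$, $B$ and the perturbation $B + \varepsilon I$ are arbitrary illustrative choices):

```python
# Characteristic polynomials of AB and BA agree even for singular A, B,
# and the coefficients vary continuously under perturbation.
import numpy as np

A = np.array([[1.0, 0.0], [0.0, 0.0]])   # singular
B = np.array([[0.0, 1.0], [0.0, 0.0]])   # singular

print(np.poly(A @ B))  # characteristic polynomial coefficients of AB
print(np.poly(B @ A))  # the same coefficients for BA

# Perturb B to the invertible B + eps*I and watch the coefficients
# of the characteristic polynomial of A(B + eps*I) converge:
for eps in (1e-1, 1e-4, 1e-8):
    print(eps, np.poly(A @ (B + eps * np.eye(2))))
```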

The result is still true over finite fields, but I don't know of a good intuitive reason; you've just got to hack it out.


On the spectra of cyclic permutations of matrix products

You can show that cyclic permutations of matrix produxts have the same spectra rather easily. Consider $$ A_1 \ldots A_n \vec v = \lambda \vec{v}.$$ Now multiply by $A_n$ on both sides. $$ A_n A_1 \ldots A_{n-1} (A_n \vec{v}) = \lambda (A_n \vec{v}).$$ So $ A_n A_1 \ldots A_{n-1}$ has the same eigenvalues as $ A_1 \ldots A_n$ with corresponding eigenvectors given by the above expressions. You can do this procedure repeatedly to show that all cyclic permutations have the same spectrum. The two matrix case is a special case of this.

Geometric interpretation

Sorry for the hand-drawn picture. This gives an explanation for the case where the eigenvalue is $1$ or $-1$. The locus of vectors turned through the same angle by a rotation matrix is a cone centred at the origin in 3D. When you compose two rotations, the eigenvectors corresponding to $1$ lie in the intersection of two such cones, one for each rotation matrix. The picture shows such an intersection. The cones intersect in two different vectors, $\vec v$ and $A\vec{v}$ in the picture. When you reverse the order in which the rotations are applied, the eigenvector changes from one of these vectors to the other. I cannot think of any geometric interpretation for the case of complex eigenvalues.

[Hand-drawn figure: two cones centred at the origin, one per rotation, intersecting along $\vec v$ and $A\vec{v}$.]
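This order-reversal of the fixed vector is easy to verify numerically (a sketch; scipy's Rotation is used only to build explicit 3D rotation matrices, and the axes and angles are arbitrary choices):

```python
# If v is fixed by BA, then A v is fixed by AB: AB(Av) = A(BAv) = Av.
import numpy as np
from scipy.spatial.transform import Rotation

A = Rotation.from_rotvec([0.3, 0.0, 0.0]).as_matrix()  # rotation about x
B = Rotation.from_rotvec([0.0, 0.5, 0.0]).as_matrix()  # rotation about y

# v = rotation axis of BA: its real eigenvector with eigenvalue 1.
lam, V = np.linalg.eig(B @ A)
v = np.real(V[:, np.argmin(np.abs(lam - 1))])
print(np.allclose(B @ A @ v, v))            # True: v is fixed by BA

# Reversing the order moves the fixed vector from v to A v:
print(np.allclose(A @ B @ (A @ v), A @ v))  # True
```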