Eigenvalues of a $4\times 4$ matrix

$A$ has zero row sums, so the all-ones vector $\mathbf e$ is an eigenvector of $A$ with eigenvalue $0$. Since $A$ is real symmetric, $\mathbf e$ extends to an orthogonal eigenbasis $\{\mathbf u,\mathbf v,\mathbf w,\mathbf e\}$ of $A$. This is also an eigenbasis of $A+\mathbf e\mathbf e^\top$: the rank-one term annihilates $\mathbf u,\mathbf v,\mathbf w$ (they are orthogonal to $\mathbf e$) and maps $\mathbf e$ to $\|\mathbf e\|^2\mathbf e$. Hence the spectrum of $A$ is $\{a,b,c,0\}$ if and only if the spectrum of $A+\mathbf e\mathbf e^\top$ is $\{a,b,c,\|\mathbf e\|^2\}=\{a,b,c,4\}$. Now $$ A+\mathbf e\mathbf e^\top=\begin{pmatrix}3&0&0&1\\ 0&4&0&0\\ 0&0&4&0\\ 1&0&0&3\end{pmatrix} $$ decouples into the two middle coordinates, each contributing the eigenvalue $4$, and the $2\times2$ block $\begin{pmatrix}3&1\\1&3\end{pmatrix}$ on the first and last coordinates, whose eigenvalues are $3\pm1=4,2$. So the eigenvalues of $A+\mathbf e\mathbf e^\top$ are $2,4,4,4$, and therefore the eigenvalues of $A$ are $2,4,4,0$.
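For a quick sanity check of this argument, here is a minimal numpy sketch (assuming $A$ is the matrix recovered by subtracting $\mathbf e\mathbf e^\top$ from the matrix displayed above):

```python
import numpy as np

# A is (A + e e^T) - e e^T, recovered from the matrix displayed above.
A = np.array([[ 2., -1., -1.,  0.],
              [-1.,  3., -1., -1.],
              [-1., -1.,  3., -1.],
              [ 0., -1., -1.,  2.]])
e = np.ones(4)

# eigvalsh applies because both matrices are real symmetric.
print(np.linalg.eigvalsh(A).round(10))                   # 0, 2, 4, 4
print(np.linalg.eigvalsh(A + np.outer(e, e)).round(10))  # 2, 4, 4, 4
```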


It helps to notice that this matrix comes from a graph: it is the so-called Laplacian of a graph. Look at the positions of the $-1$'s: take four points and join the $i$th and $j$th whenever there is a $-1$ at position $(i,j)$, and you get a graph. It is fairly symmetric: its edges are the sides of a square plus one diagonal. Now consider ways of assigning a number to each vertex of the square. From one such distribution of numbers you can produce another as follows: at each vertex, the new number is (old number) $\times$ (degree of the vertex) $-$ (sum of the numbers at the vertices joined to this one).

It may happen that the new distribution is proportional to the old one. In that case the old distribution is an eigenvector, and the constant of proportionality is the corresponding eigenvalue.
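In matrix form, this rule is multiplication by the Laplacian $L = D - \mathrm{Adj}$, where $D$ is the diagonal matrix of vertex degrees and $\mathrm{Adj}$ the adjacency matrix. A minimal sketch, with one labelling of the vertices consistent with the $-1$ pattern of $A$ (vertices $2$ and $3$ at the ends of the joined diagonal):

```python
import numpy as np

# Sides of the square plus one diagonal; vertices 2 and 3 are the
# ends of the diagonal in this (assumed) labelling.
adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 1],
                [1, 1, 0, 1],
                [0, 1, 1, 0]])
deg = np.diag(adj.sum(axis=1))   # degrees: 2, 3, 3, 2
L = deg - adj                    # this is exactly the matrix A

def new_distribution(x):
    """(old number) * (degree) - (sum over joined vertices), at each vertex."""
    return L @ x
```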

Suppose you start with a distribution in which all numbers are equal (say $1$). Then the new distribution is $0$ everywhere, so we get an eigenvector with eigenvalue $0$.

Say we place $1$ at the ends of the (joined) diagonal and $-1$ at the other two points. Check that we get an eigenvector with eigenvalue $4$.


Say we place $1$ and $-1$ at the ends of the diagonal, and $0$ at the other two points. Check that we get an eigenvector with eigenvalue $4$.

Say we place $0$ at the ends of the diagonal, and $-1$, $1$ at the other two points. Check that we get an eigenvector with eigenvalue $2$.

Note that this is a small example, where all of this can be done by hand.
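The four distributions above can also be checked in one go; a minimal sketch, with the same assumed labelling as before (vertices $2$ and $3$ at the ends of the joined diagonal):

```python
import numpy as np

L = np.array([[ 2, -1, -1,  0],
              [-1,  3, -1, -1],
              [-1, -1,  3, -1],
              [ 0, -1, -1,  2]])

claims = [
    (0, np.array([ 1,  1,  1,  1])),  # all numbers equal
    (4, np.array([-1,  1,  1, -1])),  # 1 at the diagonal ends, -1 elsewhere
    (4, np.array([ 0,  1, -1,  0])),  # 1, -1 at the diagonal, 0 elsewhere
    (2, np.array([-1,  0,  0,  1])),  # 0 at the diagonal, -1, 1 elsewhere
]
for lam, x in claims:
    assert np.array_equal(L @ x, lam * x)
print("all four eigenvector claims check out")  # eigenvalues 0, 4, 4, 2
```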


  1. $\lambda=2$ is obviously an eigenvalue: look at the first and the last columns of $A-2I$.
  2. $\lambda=0$ is also an eigenvalue: sum up all the columns of $A$ (are they linearly dependent?).
  3. Try a similar trick for $\lambda=4$.
  4. The last eigenvalue: the sum of all the eigenvalues is the trace, so once you have three of them, the last one comes for free. (Credit: a comment by Lord Shark the Unknown.) All four hints are checked numerically below.
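A minimal numpy sketch of the four hints (using the explicit matrix $A$ from the question):

```python
import numpy as np

A = np.array([[ 2, -1, -1,  0],
              [-1,  3, -1, -1],
              [-1, -1,  3, -1],
              [ 0, -1, -1,  2]])
I = np.eye(4, dtype=int)

print((A - 2*I)[:, 0], (A - 2*I)[:, 3])  # hint 1: first and last columns coincide
print(A.sum(axis=1))                     # hint 2: the columns sum to the zero vector
print((A - 4*I)[1], (A - 4*I)[2])        # hint 3: rows 2 and 3 of A - 4I coincide
print(np.trace(A) - (2 + 4 + 4))         # hint 4: the remaining eigenvalue is 0
```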

EDIT: If computing the eigenvalues directly is what you are mainly after, you can do the following: \begin{align} \det(\lambda I-A)&=\begin{vmatrix}\lambda-2 & 1 & 1 & 0\\1 & \lambda-3 & 1 & 1\\1 & 1 & \lambda-3 & 1\\0 & 1 & 1 & \lambda-2\end{vmatrix}\stackrel{(1)}{=} \begin{vmatrix}\lambda & 1 & 1 & 0\\\lambda & \lambda-3 & 1 & 1\\\lambda & 1 & \lambda-3 & 1\\\lambda & 1 & 1 & \lambda-2\end{vmatrix}\\ &\stackrel{(2)}{=}\lambda\begin{vmatrix}1 & 1 & 1 & 0\\1 & \lambda-3 & 1 & 1\\1 & 1 & \lambda-3 & 1\\1 & 1 & 1 & \lambda-2\end{vmatrix}\stackrel{(3)}{=} \lambda\begin{vmatrix}1 & 1 & 1 & 0\\0 & \lambda-4 & 0 & 1\\0 & 0 & \lambda-4 & 1\\0 & 0 & 0 & \lambda-2\end{vmatrix}. \end{align} Explanations:

(1): add all columns to the first one,

(2): factor out $\lambda$ in the first column,

(3): eliminate the remaining entries in the first column (subtract the first row from the others). The last matrix is upper triangular, so $\det(\lambda I-A)=\lambda(\lambda-4)^2(\lambda-2)$, and the eigenvalues are $0,2,4,4$.
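The computation can also be confirmed symbolically; a minimal sympy sketch:

```python
import sympy as sp

lam = sp.symbols('lambda')
A = sp.Matrix([[ 2, -1, -1,  0],
               [-1,  3, -1, -1],
               [-1, -1,  3, -1],
               [ 0, -1, -1,  2]])

# Characteristic polynomial det(lambda*I - A), factored.
p = (lam * sp.eye(4) - A).det()
print(sp.factor(p))   # lambda*(lambda - 2)*(lambda - 4)**2
```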