If $A$ is an $n \times n$ matrix such that $A^2=0$, is $A+I_{n}$ invertible?

The minus sign is not an obstacle: If $AB = -I$, then $A(-B) = -(AB) = -(-I) = I$. So in fact, if $A^2 = 0$, then $(A+I)(I-A) = A - A^2 + I - A = I$, so $A+I$ is invertible, as your first professor noted.
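To make this concrete, here is a quick check with a standard $2\times 2$ nilpotent matrix (my choice of example, not one taken from the question). Take

$A = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$, so that $A^2 = 0$; then

$(A+I)(I-A) = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}\begin{pmatrix} 1 & -1 \\ 0 & 1 \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} = I,$

exactly as the identity predicts.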

The error in the second argument is the following: It is true that if $B\mathbf{x}=\mathbf{0}$ has a nontrivial solution, then $CB\mathbf{x}=\mathbf{0}$ has a nontrivial solution. Thus, if $B$ is not invertible, then $CB$ is not invertible. But that is not what was argued. What was argued instead was that since $CB\mathbf{x}=\mathbf{0}$ has a nontrivial solution, $B\mathbf{x}=\mathbf{0}$ must also have a nontrivial solution (here $B=A+I$ and $C=A$). This implication is incorrect: you could always take $C=0$, and then it would follow that no matrix is invertible.
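To see just how badly the reversed implication can fail (a toy instance of the $C=0$ remark above): with $C = 0$ and $B = I_n$, every $\mathbf{x}$ satisfies $CB\mathbf{x} = \mathbf{0}$, yet $B\mathbf{x} = \mathbf{0}$ has only the trivial solution, since $B$ is the identity.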

It is certainly true that if $A$ is not invertible, then no multiple of $A$ is invertible (so for every $C$, neither $CA$ nor $AC$ is invertible); so you can deduce that $A(A+I)$ is not invertible. This does not prove that $A+I$ is not invertible, however, which is what you wanted to show.
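A concrete illustration of this gap (using the same $2\times 2$ example as above): if $A = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$, then $A(A+I) = A^2 + A = A$, which is certainly not invertible, yet $A + I = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}$ is.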

Now, for bonus points, show that if $A$ is an $n\times n$ matrix and $A^k=0$ for some positive integer $k$, then $A+\lambda I_n$ is invertible for any nonzero $\lambda$.
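As a sanity check of one special case (the general argument is intentionally left to you): if $A^2 = 0$ and $\lambda = 2$, then

$(A + 2I)\left(\tfrac{1}{2}I - \tfrac{1}{4}A\right) = \tfrac{1}{2}A - \tfrac{1}{4}A^2 + I - \tfrac{1}{2}A = I - \tfrac{1}{4}A^2 = I,$

so $A + 2I$ is indeed invertible, consistent with the claim.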

Added: For bonus bonus points, explain why the argument would break down if we replace $\lambda I_n$ with an arbitrary invertible matrix $B$.


For what it's worth, I just want to mention that what is happening here is actually an instance of a more general result about rings. If $R$ is a ring, then an element $a \in R$ is said to be nilpotent if there is an $n \in \mathbb{N}$ such that $a^n = 0$.

In your question, the condition on your matrix that $A^2 = 0$ just means that it is nilpotent.

Now, if $R$ is a ring with unity and $a \in R$ is nilpotent, then it can be proved that $1 - a$ is an invertible element (or unit) of $R$, meaning that there is a $b \in R$ such that $(1 - a)b = b(1 - a) = 1$. From this you can simply replace $a$ with $-a$ to see that $1 + a$ is invertible as well.
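A small concrete instance of this (my example, chosen just for illustration): in the ring $\mathbb{Z}/8\mathbb{Z}$, the element $a = 2$ is nilpotent because $2^3 = 8 = 0$, and indeed $1 - a = -1 = 7$ is a unit, since $7 \cdot 7 = 49 = 1$; likewise $1 + a = 3$ is a unit, since $3 \cdot 3 = 9 = 1$.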

I'm pretty sure that this is what Arturo had in mind by adding that exercise for bonus points, so I will not give the argument here. You can find it in this planetmath entry if you want to look at it. But I would suggest first trying it for yourself, at least for matrices.


I suggest thinking of the problem in terms of eigenvalues. Try proving the following:

If $A$ is an $n \times n$ matrix (over any field) which is nilpotent, i.e., $A^k = 0$ for some positive integer $k$, then $-1$ is not an eigenvalue of $A$ (or equivalently, $1$ is not an eigenvalue of $-A$).

If you can prove this, you can prove a stronger statement and collect bonus points from Arturo Magidin.
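As a quick sanity check of the claim in a special case (not a proof): if $A = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$, then $A^2 = 0$ and the characteristic polynomial of $A$ is $t^2$, so $0$ is the only eigenvalue of $A$; in particular $-1$ is not one.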

(Added: Adrian's answer -- which appeared while I was writing mine -- is similar, and probably better: simpler and more general. But I claim it is always a good idea to keep eigenvalues in mind when thinking about matrices!)

Added: Here's a hint for a solution that has nothing to do with eigenvalues (or, as Adrian rightly points out, really nothing to do with matrices either). Recall the formula for the sum of an infinite geometric series:

$\frac{1}{1-x} = 1 + x + x^2 + \ldots + x^n + \ldots$

As written, this is an analytic statement, so issues of convergence must be considered. (For instance, if $x$ is a real number, we need $|x| < 1$.) But if it happens that some power of $x$ equals zero, then so do all higher powers, and the series is not infinite after all... With only a little work, one can make purely algebraic sense out of this.
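If it helps, the "little work" amounts to something like the finite version of the identity above: for any $x$ and any positive integer $k$,

$(1-x)(1 + x + x^2 + \ldots + x^{k-1}) = 1 - x^k,$

so when $x^k = 0$ the right-hand side is simply $1$, and the finite sum serves as an inverse of $1 - x$, with no convergence questions at all.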