The identity cannot be a commutator in a Banach algebra?

There is a theorem of Wielandt which asserts that if $A$ is any normed algebra with identity $I = 1_{A}$, complete or not, then $I$ cannot be expressed in the form $xy - yx$. The proof is given in Rudin's book, but it is so beautiful that I give it here. Suppose that $xy - yx = I$. I claim that $xy^{n} - y^{n}x = ny^{n-1}$ for all $n \in \mathbb{N}$. The case $n = 1$ is the hypothesis. Suppose that $xy^k - y^kx = ky^{k-1}$ for some $k$. Then $$xy^{k+1} - y^{k+1}x = (xy^{k} - y^{k}x)y + y^{k}(xy-yx) = ky^{k-1}y + y^{k} \cdot I = (k+1)y^{k},$$ so the claim is established by induction. Note that $y^n \neq 0$ for every $n$: otherwise there would be a smallest $n$ with $y^n = 0$, and then $0 = xy^n - y^nx = ny^{n-1}$ would force $y^{n-1} = 0$, contrary to the minimality of $n$.

But now, for any $n$, we have $$n\|y^{n-1}\| = \|xy^{n} - y^{n}x\| \leq 2\|x\| \cdot \|y\| \cdot \|y^{n-1}\|.$$ Since $y^{n-1} \neq 0$, as remarked above, we may divide by $\|y^{n-1}\|$ to obtain $2\|x\| \cdot \|y\| \geq n$, which is absurd, as $n$ is arbitrary.
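The commutator identity at the heart of the proof can be checked concretely. On polynomials, the standard example $x = d/dt$ and $y = $ multiplication by $t$ satisfies $xy - yx = 1$ (no conflict with the theorem, since differentiation is unbounded), so it must obey $xy^{n} - y^{n}x = ny^{n-1}$. A minimal sketch in plain Python, representing a polynomial as its coefficient list (all helper names here are my own, for illustration only):

```python
# Polynomials are coefficient lists [c0, c1, ...] meaning c0 + c1*t + ... .
# x = d/dt and y = multiplication by t satisfy xy - yx = 1 on polynomials,
# so they must satisfy Wielandt's identity  x y^n - y^n x = n y^(n-1).

def x_op(p):                     # differentiation d/dt
    return [i * c for i, c in enumerate(p)][1:] or [0]

def y_op(p):                     # multiplication by t
    return [0] + p

def apply_n(op, p, n):           # op composed with itself n times
    for _ in range(n):
        p = op(p)
    return p

def scale(k, p):
    return [k * c for c in p]

def sub(p, q):                   # p - q, padding to equal length
    m = max(len(p), len(q))
    p = p + [0] * (m - len(p))
    q = q + [0] * (m - len(q))
    return [a - b for a, b in zip(p, q)]

def trim(p):                     # drop trailing zero coefficients
    while len(p) > 1 and p[-1] == 0:
        p = p[:-1]
    return p

p = [5, -2, 0, 0, 3]             # 5 - 2t + 3t^4, an arbitrary test polynomial
for n in range(1, 6):
    lhs = sub(x_op(apply_n(y_op, p, n)), apply_n(y_op, x_op(p), n))
    rhs = scale(n, apply_n(y_op, p, n - 1))
    assert trim(lhs) == trim(rhs)
```

Of course, this only illustrates the algebraic identity; the contradiction in the proof comes from combining it with boundedness, which $d/dt$ fails.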


Here's a sketch of a proof. Let $\sigma(x)$ denote the spectrum of $x$. Then $\sigma(xy)\cup\{0\} = \sigma(yx)\cup\{0\}$. On the other hand, $\sigma(1+yx)=1+\sigma(yx)$. If $xy=1+yx$, then the previous two sentences, along with the fact that the spectrum of each element of a Banach algebra is nonempty, imply that $\sigma(xy)$ is unbounded. But every element of a Banach algebra has bounded spectrum.
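One way to spell out the final step (a sketch, writing $S = \sigma(xy)$):

```latex
% Since xy = 1 + yx:
S = \sigma(1+yx) = 1 + \sigma(yx),
\qquad
(S-1)\cup\{0\} = \sigma(yx)\cup\{0\} = S\cup\{0\}.
% This gives two shift rules:
%   s \in S,\ s \neq 0 \implies s+1 \in S,
%   s \in S,\ s \neq 1 \implies s-1 \in S.
% S is nonempty; pick s \in S.  If no s+k (k \ge 0) equals 0, the upward
% chain s, s+1, s+2, ... lies in S.  Otherwise s is a nonpositive integer,
% and the downward chain s, s-1, s-2, ... never meets 1, so it lies in S.
% Either way S is unbounded.
```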

(I don't remember where I first learned this proof, nor do I have a reference for it off-hand, but I did not come up with it myself.)

The proof that $\sigma(xy)\cup\{0\}=\sigma(yx)\cup\{0\}$ reduces to showing that $1-xy$ is invertible if and only if $1-yx$ is invertible, a problem that was the subject of a MathOverflow question.
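That equivalence has a one-line verification via the standard formula: if $u = (1-xy)^{-1}$, then $1 + yux$ inverts $1 - yx$. A sketch:

```latex
(1-yx)(1+yux) = 1 + yux - yx - yxyux
              = 1 + y\bigl((1-xy)u - 1\bigr)x = 1 + y(1-1)x = 1,
```

and symmetrically $(1+yux)(1-yx) = 1 + y\bigl(u(1-xy) - 1\bigr)x = 1$.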


There's a proof using derivations in section 2.2 of Sakai's book, Operator algebras in dynamical systems: the theory of unbounded derivations in $C^*$-algebras. A bounded derivation on a Banach algebra $A$ is a bounded linear map $\delta$ on $A$ such that $\delta(ab)=\delta(a)b+a\delta(b)$ for all $a$ and $b$ in $A$. Theorem 2.2.1 on page 18 shows that if $\delta$ is a bounded derivation on $A$, and if $a$ is an element of $A$ such that $\delta^2(a)=0$, then $\lim\limits_{n\to\infty}\|\delta(a)^n\|^{1/n}=0$. The proof uses induction with a neat computation to show that $\delta^2(a)=0$ implies that $n!\delta(a)^n=\delta^n(a^n)$, and then the result follows from boundedness of $\delta$ and the fact that $\lim\limits_{n\to\infty}\frac{1}{\sqrt[n]{n!}}=0$.
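The inductive computation can be sketched as follows. By the Leibniz rule, $\delta^{n}(bc) = \sum_{k}\binom{n}{k}\delta^{k}(b)\,\delta^{n-k}(c)$, and $\delta^{2}(a) = 0$ kills every term with $k \geq 2$:

```latex
\delta^{n}(a^{n}) = \delta^{n}\!\bigl(a \cdot a^{n-1}\bigr)
  = a\,\delta^{n}(a^{n-1}) + n\,\delta(a)\,\delta^{n-1}(a^{n-1}).
```

Inductively $\delta^{n-1}(a^{n-1}) = (n-1)!\,\delta(a)^{n-1}$, so $\delta^{n}(a^{n-1}) = (n-1)!\,\delta\bigl(\delta(a)^{n-1}\bigr) = 0$, since each Leibniz term contains a factor $\delta^{2}(a)$. Hence $\delta^{n}(a^{n}) = n\,(n-1)!\,\delta(a)^{n} = n!\,\delta(a)^{n}$, and then $\|\delta(a)^{n}\|^{1/n} \leq \|\delta\|\,\|a\|\,/\sqrt[n]{n!} \to 0$.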

Corollary 2.2.2 concludes that the identity is not a commutator. If $ab-ba=1$, then the bounded derivation $\delta_a:A\to A$ defined by $\delta_a(x)=ax-xa$ satisfies $\delta_a^2(b)=\delta_a(1)=0$. By the preceding theorem, this implies that $1=\lim\limits_{n\to\infty}\|1^n\|^{1/n}=\lim\limits_{n\to\infty}\|\delta_a(b)^n\|^{1/n}=0$.

(Completeness is not used in this approach. An element $x$ of $A$ satisfies $\lim\limits_{n\to\infty}\|x^n\|^{1/n}=0$ if and only if $\sigma(x)=\{0\}$, and such an $x$ is called a generalized nilpotent. Incidentally, this also gives an approach to answering the example problem in the MathOverflow question "Linear Algebra Problems?". The remainder of Section 2.2 has a number of interesting results on bounded derivations and commutators of bounded operators.)