Relation of this antisymmetric matrix $r = \left(\begin{smallmatrix}0 &1\\-1&0\end{smallmatrix}\right)$ to $i$

joriki's answer is really nice and to the point as usual, but let me add my two cents and spell out the relation to the matrix representation of complex numbers.

First notice that multiplication by $i$ corresponds to a (counterclockwise) $90^{\circ}$-rotation around the origin in the complex plane. Now we can consider $\mathbb{C}$ as a $2$-dimensional vector space over $\mathbb{R}$ with basis $1,i$, so that $z = a + bi = \begin{pmatrix}a\\b\end{pmatrix}$. Let me denote the $\mathbb{R}$-linear map $z \mapsto iz$ by $\mathbf{J}$. Note that for $z = a + bi = \begin{pmatrix}a\\b\end{pmatrix}$ we have $iz = i(a + bi) = -b + ai = \begin{pmatrix}-b\\a\end{pmatrix}$, so we must have $\mathbf{J} = \begin{pmatrix}0&-1\\1&0\end{pmatrix}$. You can of course also see this by remembering that a rotation by the angle $\alpha$ has the matrix $\begin{pmatrix}\cos{\alpha}&-\sin{\alpha}\\\sin{\alpha}&\cos{\alpha}\end{pmatrix}$.
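As a quick sanity check, plugging $\alpha = 90^{\circ}$ into the rotation matrix gives back exactly $\mathbf{J}$:
$$\begin{pmatrix}\cos{90^{\circ}}&-\sin{90^{\circ}}\\\sin{90^{\circ}}&\cos{90^{\circ}}\end{pmatrix} = \begin{pmatrix}0&-1\\1&0\end{pmatrix} = \mathbf{J}.$$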

So far so good, but this is only the beginning of the story! Now clearly we have $\mathbf{J}^2 = - \mathbf{1}$, $\mathbf{J}^3 = -\mathbf{J}$ and $\mathbf{J}^4 = \mathbf{1}$, so $\mathbf{J}$ satisfies the same relations we're used to from $i$...
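Indeed, multiplying out,
$$\mathbf{J}^2 = \begin{pmatrix}0&-1\\1&0\end{pmatrix}\begin{pmatrix}0&-1\\1&0\end{pmatrix} = \begin{pmatrix}-1&0\\0&-1\end{pmatrix} = -\mathbf{1},$$
and the identities for $\mathbf{J}^3$ and $\mathbf{J}^4$ follow by multiplying by $\mathbf{J}$ once and twice more.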

Given this, it is natural to try and look at matrices of the form $a\mathbf{1} + b\mathbf{J} = \begin{pmatrix}a&-b\\b&a\end{pmatrix}$ (each of which is a scalar multiple of the identity plus an antisymmetric matrix; they are not themselves antisymmetric unless $a = 0$).
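Writing the sum out shows where the antisymmetry sits: the scalar part is symmetric, and only the $\mathbf{J}$-part is antisymmetric,
$$a\mathbf{1} + b\mathbf{J} = \begin{pmatrix}a&0\\0&a\end{pmatrix} + \begin{pmatrix}0&-b\\b&0\end{pmatrix}.$$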

Since we're working in a vector space of matrices, addition behaves in exactly the same way as usual, so let us look at multiplication. You should convince yourself that matrix multiplication gives $(a\mathbf{1} + b\mathbf{J})(c\mathbf{1}+d\mathbf{J}) = (ac-bd)\mathbf{1} + (ad+bc)\mathbf{J}$, giving us back the multiplication rule for complex numbers from matrix multiplication. Note also that complex conjugation simply corresponds to transposition, and that the determinant encodes the square of the absolute value, as you can easily check.
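As a concrete check of the multiplication rule, take $z = 1 + 2i$ and $w = 3 - i$, so that $zw = (1+2i)(3-i) = 5 + 5i$; on the matrix side,
$$\begin{pmatrix}1&-2\\2&1\end{pmatrix}\begin{pmatrix}3&1\\-1&3\end{pmatrix} = \begin{pmatrix}5&-5\\5&5\end{pmatrix} = 5\,\mathbf{1} + 5\,\mathbf{J},$$
and for the determinant, $\det(a\mathbf{1} + b\mathbf{J}) = a^2 + b^2 = |a + bi|^2$.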

If editing were not so painfully slow at the moment, I'd have loved to elaborate further by plugging in complex values and ending up with the quaternions and the Pauli matrices, but for the moment a simple Wikipedia link will have to do. See in particular the passage on matrix representations of the quaternions.


The connection is due to the fact that this matrix has eigenvalues $\mathrm i$ and $-\mathrm i$. Since the eigenvalues of the square of a matrix are the squares of its eigenvalues, both eigenvalues of the square are $-1$. Moreover, having two distinct eigenvalues, the matrix is diagonalizable, so its square is diagonalizable with both eigenvalues equal to $-1$ and must therefore be $-I$. The same is true for any diagonalizable square matrix, of any dimension, whose eigenvalues are all $\pm\mathrm i$.
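Concretely, the characteristic polynomial of the matrix in question is
$$\det\begin{pmatrix}-\lambda&1\\-1&-\lambda\end{pmatrix} = \lambda^2 + 1,$$
whose roots are exactly $\pm\mathrm i$.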


One of the first algebra courses I took as a student defined $i$ as the matrix $\begin{pmatrix}0&1\\-1&0\end{pmatrix}$ and went on to define the complex numbers in terms of the appropriate $2 \times 2$ real matrices. I had seen complex numbers before, of course, with the usual $i^2 = -1$ definition, but I found the matrix definition much more satisfying: you didn't need to "invent" an element with $i^2 = -1$, you could see it with your own eyes, in a familiar context.
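And indeed you can verify $i^2 = -1$ with a single matrix multiplication:
$$\begin{pmatrix}0&1\\-1&0\end{pmatrix}\begin{pmatrix}0&1\\-1&0\end{pmatrix} = \begin{pmatrix}-1&0\\0&-1\end{pmatrix}.$$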

In a similar spirit, you might like to think about identifying the division algebra of quaternions with the ring of $2 \times 2$ complex matrices of the form $\begin{pmatrix}z&w\\-\overline{w}&\overline{z}\end{pmatrix}$. I find this much easier to remember than the definition of the quaternions as $4 \times 4$ real matrices.
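Under this identification, one standard choice of the quaternion units (sign conventions vary between authors) is
$$\mathbf{i} = \begin{pmatrix}i&0\\0&-i\end{pmatrix},\qquad \mathbf{j} = \begin{pmatrix}0&1\\-1&0\end{pmatrix},\qquad \mathbf{k} = \begin{pmatrix}0&i\\i&0\end{pmatrix},$$
obtained by taking $(z,w) = (i,0)$, $(0,1)$ and $(0,i)$ respectively; direct multiplication checks that $\mathbf{i}^2 = \mathbf{j}^2 = \mathbf{k}^2 = \mathbf{i}\mathbf{j}\mathbf{k} = -\mathbf{1}$. Note that $\mathbf{j}$ is again the matrix this whole discussion started with.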