How to compute the SVD of $2\times 2$ matrices?

There's a ridiculously easy way to apply the method you already know, starting from your matrix $\mathbf A$. Consider the QR decomposition

$$\mathbf A=\mathbf Q\mathbf R$$

where $\mathbf Q$ is orthogonal and $\mathbf R$ is upper triangular. You say you already know how to take the SVD of a triangular matrix:

$$\mathbf R=\mathbf W\mathbf \Sigma\mathbf V^\top$$

where both $\mathbf W$ and $\mathbf V^\top$ are orthogonal and $\mathbf \Sigma$ is diagonal. You know (or at least are supposed to know) that the product of two orthogonal matrices is again orthogonal, so letting $\mathbf U=\mathbf Q\mathbf W$, you obtain the desired singular value decomposition $\mathbf A=\mathbf U\mathbf \Sigma\mathbf V^\top$.
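
In code, the composition is just a couple of library calls. Here is a minimal NumPy sketch of the idea, using `numpy.linalg.svd` on $\mathbf R$ in place of whatever triangular-SVD routine you already have:

```python
import numpy as np

A = np.random.randn(2, 2)          # any square matrix will do for the sketch
Q, R = np.linalg.qr(A)             # A = Q R, Q orthogonal, R upper triangular
W, sigma, Vt = np.linalg.svd(R)    # R = W diag(sigma) Vt
U = Q @ W                          # product of orthogonal matrices is orthogonal

assert np.allclose(U @ np.diag(sigma) @ Vt, A)   # A = U diag(sigma) Vt
```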

In general, preprocessing your matrix with a QR decomposition makes for an easier SVD computation. For a general $m\times n$ matrix, one can do a "thin QR decomposition" (with column pivoting if needed) where $\mathbf Q$ has the same dimensions as the original matrix and $\mathbf R$ is square and triangular. One can then take the SVD of $\mathbf R$ and multiply out the orthogonal matrices (along with the permutation matrix if you did pivoting; remember that permutation matrices are also orthogonal!) to obtain the singular value decomposition of the original matrix.
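
Here is a sketch of that $m\times n$ recipe, assuming SciPy's column-pivoted thin QR (`scipy.linalg.qr` with `pivoting=True`), which returns pivot indices `piv` such that `A[:, piv] = Q @ R`:

```python
import numpy as np
from scipy.linalg import qr

m, n = 7, 4
A = np.random.randn(m, n)

Q, R, piv = qr(A, mode='economic', pivoting=True)   # thin QR: Q is m x n, R is n x n
E = np.eye(n)[:, piv]                                # permutation matrix (also orthogonal)
W, sigma, Vt = np.linalg.svd(R)                      # SVD of the small triangular factor
U = Q @ W                                            # m x n, orthonormal columns
Vt_full = Vt @ E.T                                   # fold the permutation into V^T

assert np.allclose(U @ np.diag(sigma) @ Vt_full, A)  # A = U diag(sigma) V^T
```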


FWIW, in the $2\times 2$ case, you're charmed, since you can use an appropriately constructed Givens rotation matrix for both the QR and SVD stages. The details should be in e.g. Golub and Van Loan's book.
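
For illustration, the QR stage of that $2\times 2$ computation is a single Givens rotation whose angle is chosen to annihilate the $(2,1)$ entry; a small sketch (assuming the first column is nonzero):

```python
import numpy as np

A = np.random.randn(2, 2)
r = np.hypot(A[0, 0], A[1, 0])
c, s = A[0, 0] / r, A[1, 0] / r      # cosine/sine of the Givens angle
G = np.array([[c, s], [-s, c]])      # Givens rotation (orthogonal)
R = G @ A                            # upper triangular: R[1, 0] is (numerically) zero
Q = G.T                              # so A = Q R

assert np.allclose(R[1, 0], 0)
assert np.allclose(Q @ R, A)
```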


The SVD of a $2\times 2$ matrix has a closed-form formula, which can be worked out by writing the rotation matrices in terms of a single unknown angle each, and then solving for those angles as well as the singular values.

It is worked out here, for instance.
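
Here is a minimal sketch of one such closed form, following the two-angle parametrization just described. The intermediate names `E`, `F`, `G`, `H` are just shorthands introduced here, and a possible negative sign is absorbed into $\mathbf V$ so the singular values come out nonnegative:

```python
import numpy as np

def svd_2x2(A):
    """Closed-form SVD of a real 2x2 matrix: A = U @ diag(s) @ Vt,
    with U, Vt orthogonal and s[0] >= s[1] >= 0."""
    a, b = A[0, 0], A[0, 1]
    c, d = A[1, 0], A[1, 1]
    E, F = (a + d) / 2, (a - d) / 2
    G, H = (c + b) / 2, (c - b) / 2
    Q, R = np.hypot(E, H), np.hypot(F, G)        # (s1 + s2)/2 and (s1 - s2)/2
    s1, s2 = Q + R, Q - R                        # s2 < 0 exactly when det(A) < 0
    a1, a2 = np.arctan2(G, F), np.arctan2(H, E)  # beta + gamma and beta - gamma
    beta, gamma = (a2 + a1) / 2, (a1 - a2) / 2   # left and right rotation angles
    U = np.array([[np.cos(beta), -np.sin(beta)],
                  [np.sin(beta),  np.cos(beta)]])
    V = np.array([[np.cos(gamma), -np.sin(gamma)],
                  [np.sin(gamma),  np.cos(gamma)]])
    if s2 < 0:                                   # absorb the sign into V
        V[:, 1] *= -1
        s2 = -s2
    return U, np.array([s1, s2]), V.T

# Quick check against NumPy's SVD:
A = np.random.randn(2, 2)
U, s, Vt = svd_2x2(A)
assert np.allclose(U @ np.diag(s) @ Vt, A)
assert np.allclose(s, np.linalg.svd(A, compute_uv=False))
```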