Intuitively, what is the difference between Eigendecomposition and Singular Value Decomposition?
Consider the eigendecomposition $A = P D P^{-1}$ and the SVD $A = U \Sigma V^*$. Some key differences are as follows:
- The vectors in the eigendecomposition matrix $P$ are not necessarily orthogonal, so the change of basis isn't a simple rotation. On the other hand, the vectors in the matrices $U$ and $V$ in the SVD are orthonormal, so they do represent rotations (and possibly flips).
- In the SVD, the nondiagonal matrices $U$ and $V$ are not necessarily inverses of one another; in general they are not related to each other at all. In the eigendecomposition, the nondiagonal matrices $P$ and $P^{-1}$ are inverses of each other.
- In the SVD the entries in the diagonal matrix $\Sigma$ are all real and nonnegative. In the eigendecomposition, the entries of $D$ can be any complex number: negative, positive, imaginary, whatever.
- The SVD always exists, for any rectangular or square matrix, whereas the eigendecomposition exists only for square matrices, and even among square matrices it sometimes fails to exist (only diagonalizable matrices have one). These points are easy to verify numerically, as the sketch after this list illustrates.
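A minimal NumPy check of these points (the example matrices here are arbitrary):

```python
import numpy as np

# A square, non-symmetric matrix: eigendecomposition A = P D P^{-1}
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
eigvals, P = np.linalg.eig(A)
print(P.T @ P)                          # not the identity: eigenvectors need not be orthogonal

# SVD A = U Sigma V^*: U and V are orthonormal, Sigma is real and nonnegative
U, s, Vt = np.linalg.svd(A)
print(np.allclose(U.T @ U, np.eye(2)))  # True
print(s)                                # all real, all nonnegative

# The SVD also exists for rectangular matrices, where eig is not even defined
B = np.arange(6.0).reshape(3, 2)
print(np.linalg.svd(B)[1])              # works fine
# np.linalg.eig(B) would raise LinAlgError: the array must be square
```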
I encourage you to see an $m \times n$ real-valued matrix $A$ as a bilinear operator between two spaces; intuitively, one space ($\mathbb{R}^m$) lies to the left of $A$ and the other ($\mathbb{R}^n$) to the right. "Bilinear" simply means that $A$ is linear in both directions (left to right or right to left). The operations $A$ can perform are limited to scaling, rotation, and reflection, and combinations of these; any other kind of operation is non-linear.
$A$ transforms vectors between the two spaces via multiplication:
$x^T A = y^T$ transforms a left vector $x$ to a right vector $y$.
$x = A y$ transforms a right vector $y$ to a left vector $x$.
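In code these two directions are just left and right multiplication; here is a small NumPy sketch with arbitrarily chosen shapes:

```python
import numpy as np

m, n = 3, 2
A = np.arange(6.0).reshape(m, n)   # maps between R^3 (left) and R^2 (right)

x = np.ones(m)                     # a "left" vector in R^m
y = np.ones(n)                     # a "right" vector in R^n

y_t = x @ A                        # x^T A = y^T : left space -> right space
x_new = A @ y                      # x = A y     : right space -> left space
print(y_t.shape, x_new.shape)      # (2,) (3,)
```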
The point of decompositions of $A$ is to identify, or highlight, aspects of the action of $A$ as an operator. The eigendecomposition of $A$ clarifies what $A$ does by finding the eigenvalues and eigenvectors that satisfy the constraint
$A x = \lambda x$.
This constraint identifies vectors (directions) $x$ that are not rotated by $A$, and the scalars $\lambda$ associated with each of those directions.
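A quick sketch of that constraint, checking each eigenpair of an arbitrary symmetric example matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
eigvals, eigvecs = np.linalg.eig(A)      # columns of eigvecs are the eigenvectors

for lam, x in zip(eigvals, eigvecs.T):
    # A x points along x, merely scaled by lambda: the direction is not rotated
    print(np.allclose(A @ x, lam * x))   # True
```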
The problem with eigendecomposition is that when the matrix isn't square, the left and right spaces consist of vectors of different dimensions, and are therefore completely different spaces; there really isn't a sense in which $A$'s action can be described as involving a "rotation", because the left and right spaces are not "oriented" relative to one another. There just isn't a way to generalize the notion of an eigendecomposition to a non-square matrix $A$.
Singular vectors provide a different way to identify vectors for which the action of $A$ is simple, one that does generalize to the case where the left and right spaces are different. A corresponding pair of singular vectors has a scalar $\sigma$ by which $A$ scales, whether transforming from the left space to the right space or vice versa:
$x^T A = \sigma y^T$
$\sigma x = A y$.
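This pair of relations is easy to check numerically; in the sketch below, $x$ is a column of $U$ and $y$ the corresponding column of $V$ from NumPy's SVD:

```python
import numpy as np

A = np.arange(6.0).reshape(3, 2)    # left space R^3, right space R^2
U, s, Vt = np.linalg.svd(A)

for i, sigma in enumerate(s):
    x, y = U[:, i], Vt[i, :]        # a corresponding pair of singular vectors
    print(np.allclose(x @ A, sigma * y))   # x^T A = sigma y^T
    print(np.allclose(A @ y, sigma * x))   # sigma x = A y
```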
Thus, eigendecomposition represents $A$ in terms of how it scales vectors it doesn't rotate, while singular value decomposition represents $A$ in terms of corresponding vectors that are scaled by the same amount, whether moving from the left space to the right space or vice versa. When the left and right spaces are the same (i.e. when $A$ is square), singular value decomposition represents $A$ in terms of how it rotates and reflects vectors that $A$ and $A^T$ scale by the same amount.
Intuitively, the SVD says that for any linear map there is an orthonormal frame in the domain that is mapped to a (generally different) orthonormal frame in the image space, with each direction scaled by its singular value.
Eigendecomposition says that there is a basis, not necessarily orthonormal, that is simply scaled when the matrix is applied. That is assuming you have $n$ linearly independent eigenvectors, of course; when you don't, the map can only be brought to an upper triangular form (as in the Jordan normal form) rather than a diagonal one.
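Here is a sketch of that defective case, using the standard shear example, which has a repeated eigenvalue but only one independent eigenvector:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])      # a shear: not diagonalizable
vals, P = np.linalg.eig(A)      # both eigenvalues are 1
print(P)                        # the two "eigenvector" columns are parallel

# Reconstructing P D P^{-1} fails to recover A
D = np.diag(vals)
print(np.allclose(P @ D @ np.linalg.inv(P), A))   # False
```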
Edit: Consider the difference for a rotation matrix in $\mathbb{R}^2$.
Here there are no real eigenvalues, which corresponds to there being no choice of basis that is simply scaled under the transformation. On the other hand, SVD makes a lot of sense here, because it says we can take the standard basis in the domain, map it to the rotated version of this basis (thought of as lying in the image space), and scale everything by $1$.
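A sketch with a concrete 45-degree rotation:

```python
import numpy as np

theta = np.pi / 4
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(np.linalg.eig(R)[0])   # complex eigenvalues: no real direction is merely scaled
print(np.linalg.svd(R)[1])   # singular values [1. 1.]: a pure rotation, no stretching
```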