Expressing a matrix as an expansion of its eigenvalues

The proof using $AU = U\Lambda$ is not tedious. Since $U$ is orthogonal, $U^{-1} = U^T$, so $A = U \Lambda U^T$. Then $$Ax = U \Lambda U^T x = U \Lambda \begin{bmatrix} u_1^T x \\ \vdots \\ u_n^T x \end{bmatrix} = U \begin{bmatrix} \lambda_1 u_1^T x \\ \vdots \\ \lambda_n u_n^T x \end{bmatrix} = \sum_k (\lambda_k u_k^T x) u_k = \sum_k \lambda_k u_k u_k^T x = \left(\sum_k \lambda_k u_k u_k^T\right)x$$

Hence $A=\sum_k \lambda_k u_k u_k^T$.
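As a quick numerical sanity check of the expansion (the $3 \times 3$ matrix below is an arbitrary illustrative choice):

```python
# Verify A = sum_k lambda_k u_k u_k^T for a real symmetric matrix.
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# eigh returns the eigenvalues and an orthonormal eigenvector basis
# (the columns of U) for a symmetric/Hermitian matrix.
lam, U = np.linalg.eigh(A)

# Rebuild A as a sum of rank-one terms lambda_k u_k u_k^T.
A_rebuilt = sum(lam[k] * np.outer(U[:, k], U[:, k]) for k in range(3))

print(np.allclose(A, A_rebuilt))  # True
```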

Assuming $A$ is invertible (i.e., no $\lambda_k$ is zero), inverting both sides of $AU = U\Lambda$ gives $U^{-1} A^{-1} = \Lambda^{-1} U^{-1}$, i.e. $U^T A^{-1} = \Lambda^{-1} U^T$, and hence $A^{-1} = U\Lambda^{-1} U^T$. Applying the above result to $A^{-1}$, and noting that $\Lambda^{-1}$ is just the diagonal matrix of the reciprocals of the diagonal elements of $\Lambda$, we have $A^{-1} = \sum_k \frac{1}{\lambda_k} u_k u_k^T$.
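The inverse formula can be checked the same way (again, the particular matrix is just an invertible example):

```python
# Verify A^{-1} = sum_k (1/lambda_k) u_k u_k^T for an invertible symmetric matrix.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # eigenvalues 1 and 3, both nonzero

lam, U = np.linalg.eigh(A)
A_inv = sum((1.0 / lam[k]) * np.outer(U[:, k], U[:, k]) for k in range(2))

print(np.allclose(A_inv, np.linalg.inv(A)))  # True
```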

To address your other question: the same result holds for Hermitian matrices ($A^* = A$), with the proviso that $U$ will be unitary rather than orthogonal (i.e., it may be complex), and the expansion becomes $A = \sum_k \lambda_k u_k u_k^*$. The eigenvalues are still real.
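A sketch of the Hermitian case, using an arbitrary $2 \times 2$ Hermitian matrix; note the conjugate transpose in the rank-one terms:

```python
# For Hermitian A the expansion is A = sum_k lambda_k u_k u_k^*,
# with complex (unitary) eigenvectors but real eigenvalues.
import numpy as np

A = np.array([[2.0, 1.0j],
              [-1.0j, 2.0]])   # Hermitian: A equals its conjugate transpose

lam, U = np.linalg.eigh(A)     # eigh accepts Hermitian input
A_rebuilt = sum(lam[k] * np.outer(U[:, k], U[:, k].conj()) for k in range(2))

print(np.allclose(A, A_rebuilt))   # True
print(np.allclose(lam.imag, 0))    # True: the eigenvalues are real
```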

A normal matrix ($A A^* = A^* A$) can also be expanded this way, except that the eigenvalues (and, of course, the eigenvectors) may be complex.

The matrix $\begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}$ is real but not symmetric, and it does not have a basis of eigenvectors (hence it cannot be expressed as above).
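You can see the failure numerically: the eigenvalue $0$ has algebraic multiplicity $2$ but only a one-dimensional eigenspace, so the eigenvector matrix is singular.

```python
# [[0, 1], [0, 0]] is defective: both eigenvalues are 0, but the
# eigenvectors span only one dimension, so no eigenvector basis exists.
import numpy as np

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])

lam, V = np.linalg.eig(A)
print(np.allclose(lam, 0))                  # True: eigenvalue 0, twice
# The two returned eigenvectors are (numerically) parallel, so V is singular.
print(abs(np.linalg.det(V)) < 1e-8)         # True
```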

The matrix $\begin{bmatrix} 0 & i \\ i & 0 \end{bmatrix} $ is symmetric but not real (it is normal). It can be unitarily diagonalized, but the eigenvalues and eigenvectors are complex.