Get Eigensystem to return orthogonal eigenvectors for Hermitian matrix

While the documentation does not specifically say that symbolic Hermitian matrices are not necessarily given orthonormal eigenbases, it does say

For approximate numerical matrices m, the eigenvectors are normalized.

For exact or symbolic matrices m, the eigenvectors are not normalized.

From this, it's reasonable to guess that if Mathematica isn't smart enough to normalize symbolic results, then it's also probably not smart enough to orthonormalize them, either.

Why doesn't Mathematica do this? For exact input, the resulting eigenvectors can be complicated symbolic expressions, and the norms and inner products required by the Gram-Schmidt procedure can be prohibitively expensive to compute and simplify. As a result, Mathematica does not normalize symbolic eigenvectors, because doing so could be catastrophically slow.

In contrast, such performance considerations do not apply to floating-point results, so by default Mathematica orthonormalizes them. Thus, if numeric results are acceptable, you can simply apply N to your matrix, and the eigenvectors will automatically be orthonormal.
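For example, using the matrix from below, something like this should show the numeric eigenvectors coming back orthonormal (Chop removes round-off residue):

```mathematica
(* numeric sketch: apply N before Eigensystem to get orthonormal eigenvectors *)
A = {{1, -2, 0, -2, 0, 0}, {-2, 1, 0, -2, 0, 0}, {0, 0, 1, 0, -2, -2},
     {-2, -2, 0, 1, 0, 0}, {0, 0, -2, 0, 1, -2}, {0, 0, -2, 0, -2, 1}};
{nvals, nvecs} = Eigensystem[N[A]];
Chop[nvecs.Transpose[nvecs]] == IdentityMatrix[6]
(* True *)
```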

Otherwise, if numeric results are not acceptable, then you will have to manually orthogonalize the individual subspaces. Here is one method:

A = {{1, -2, 0, -2, 0, 0}, {-2, 1, 0, -2, 0, 0}, {0, 0, 1, 0, -2, -2},
     {-2, -2, 0, 1, 0, 0}, {0, 0, -2, 0, 1, -2}, {0, 0, -2, 0, -2, 1}};
{val, vec} = Eigensystem[A];
(* group the eigenvectors by (numericized) eigenvalue, orthogonalize each
   degenerate subspace separately, then flatten back into a single basis *)
o = Flatten[Orthogonalize /@ FindClusters[N@val -> vec], 1];

It is orthonormal, and it diagonalizes A:

o.o\[Transpose] == IdentityMatrix[6]
Simplify[o.A.o\[Transpose]] == DiagonalMatrix[val]
(*True*)
(*True*)

While it works for your example matrix, it may fail for more complicated cases, so check your results.

As a further example of the catastrophic issues involved with orthogonalizing symbolic vectors, try executing the following:

Orthogonalize[Eigenvectors[#.#\[Transpose] &@RandomInteger[{-5, 5}, {6, 6}]]]

The calculation just goes on and on, because the eigenvectors consist of giant Root objects. So if symbolic results are what you need, you may run into trouble.


Here is a method that works when the eigenvalues do not involve Root objects: perturb symmetrically, in such a way that equal eigenvalues become unequal (or enough of them do that we can obtain an orthogonal set of eigenvectors), then take the limit as the perturbation goes to zero.

Here I add e to the (1,3) and (3,1) positions.

pmat = {{1, -2, e, -2, 0, 0}, {-2, 1, 0, -2, 0, 0}, {e, 0, 1, 0, -2, -2},
  {-2, -2, 0, 1, 0, 0}, {0, 0, -2, 0, 1, -2}, {0, 0, -2, 0, -2, 1}};

{pvals, pvecs} = Eigensystem[pmat];
vecs = Limit[pvecs, e -> 0]

(* Out[3691]= {{0, 0, 0, 0, -1, 1}, {0, -1, 0, 1, 0, 0}, {-1, -1, 1, -1, 1, 1},
  {2, -1, -2, -1, 1, 1}, {1, 1, 1, 1, 1, 1}, {-2, 1, -2, 1, 1, 1}} *)

Check orthogonality:

In[3692]:= vecs.Transpose[vecs]

(* Out[3692]= {{2, 0, 0, 0, 0, 0}, {0, 2, 0, 0, 0, 0}, {0, 0, 6, 0, 0, 0},
  {0, 0, 0, 12, 0, 0}, {0, 0, 0, 0, 6, 0}, {0, 0, 0, 0, 0, 12}} *)
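The limit vectors are orthogonal but not normalized; if an orthonormal basis is needed, a final Normalize pass should suffice (a sketch, reusing the vecs computed above):

```mathematica
(* normalize the orthogonal eigenvectors obtained from the limit above *)
nvecs = Normalize /@ vecs;
Simplify[nvecs.Transpose[nvecs]] == IdentityMatrix[6]
(* True *)
```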

The reason the orthogonalization fails is not that there is any problem performing Gram-Schmidt. It is that no general formula in radicals exists for solving polynomials of degree five or higher, and computing the eigenvalues of a 5x5 matrix is equivalent to solving a 5th-degree polynomial (http://mathworld.wolfram.com/Polynomial.html).

The original problem had trouble getting orthogonal symbolic eigenvectors for the matrix

M = {{1, -2, 0, -2, 0, 0}, {-2, 1, 0, -2, 0, 0}, {0, 0, 1, 0, -2, -2}, {-2, -2, 0, 1, 0, 0}, {0, 0, -2, 0, 1, -2}, {0, 0, -2, 0, -2, 1}}

In spite of the fact that this is a 6x6 matrix, its eigenvalues can be computed exactly: they are +3 (with multiplicity 4) and -3 (with multiplicity 2).

 ev=Eigenvalues[M]

With that knowledge in hand, it is trivial to compute an orthogonal basis for the degenerate eigenvalues. First note that a basis can be computed by asking for the null space of M - lambda I for each lambda in the spectrum of M. In the case at hand, the following two sets of eigenvectors span the two subspaces. The eigenvectors in one set are orthogonal to those in the other set, as they must be.

 evp = NullSpace[(M - 3 IdentityMatrix[6])]
 evm = NullSpace[(M + 3 IdentityMatrix[6])]

evp[[1]].evm[[1]]
(* 0 *)

Orthogonalization of the degenerate subspaces then proceeds without difficulty:

orthogplus = Orthogonalize[evp]
orthogminus = Orthogonalize[evm]

Both work quite cleanly.

If you can get the exact eigenvalues you will be able to obtain the exact eigenvectors and you'll be able to orthogonalize those that correspond to degenerate eigenvalues without difficulty. A general procedure for obtaining a symbolic orthogonal basis for any matrix M is:

M = {{1, -2, 0, -2, 0, 0}, {-2, 1, 0, -2, 0, 0}, {0, 0, 1, 
0, -2, -2}, {-2, -2, 0, 1, 0, 0}, {0, 0, -2, 0, 1, -2}, {0, 0, -2,
 0, -2, 1}};

{evals,evecs}=Eigensystem[M];

If the spectrum (evals) is not exact, STOP: there is no symbolic eigenbasis, because the eigenvalues cannot be computed symbolically. I don't know a built-in way to make Mathematica test whether a spectrum is exact, but it will be apparent by inspection: exact eigenvalues will either be rational or involve symbolic expressions containing radicals.
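As a rough sketch (my own heuristic, not a built-in test), one could check that the eigenvalues have infinite precision and contain no unevaluated Root objects:

```mathematica
(* heuristic check: an exact, radical-expressible spectrum has infinite
   precision and no Root objects; this is an assumption, not a built-in *)
exactSpectrumQ[evals_] := Precision[evals] === Infinity && FreeQ[evals, Root]
```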

orthogonaleigenvectors = Orthogonalize[evecs];