Are eigenvectors of a real symmetric or Hermitian matrix computed from `eigen` all orthogonal?

In Fortran/C, for a real symmetric or complex Hermitian matrix A, I can call the LAPACK routines dsyev/zheev, and the resulting eigenvectors are normalized and mutually orthogonal. What method is Julia using to diagonalize such a matrix? I searched the source of eigen.jl but did not find any such dsyev or zheev calls.

So when one uses the method eigen() from LinearAlgebra to calculate the eigensystem,

eigvals, eigvecs = eigen(A)

are all the calculated eigenvectors eigvecs normalized and mutually orthogonal, meaning

norm(eigvecs*eigvecs' - Matrix(I, size(A, 1), size(A, 1))) < tol

where tol is a very small number, say 1e-15?

The short answer is yes. eigen(A) eventually calls LAPACK.ggev!, which gives both the eigenvalues and the eigenvectors.


Also, it might be worth noting that the specific behavior of eigen depends on the structure of your matrix; what low-level work it actually does is an implementation detail and won’t be the same for all types of matrices.

The answer is probably yes, but I do not see why the LAPACK.ggev! call guarantees that the eigenvectors are orthonormal.

Yes, but the implementation detail is exactly what matters here. I need to know which low-level LAPACK (or other) routines eigen() invokes in order to be convinced that the eigenvectors are mutually orthogonal.

The reason the eigenvectors are orthonormal is that eigenvectors are always orthonormal. Otherwise, they aren’t eigenvectors.

Also, for symmetric/Hermitian matrices it will actually call LAPACK.sygvd!

Not for Hermitian matrices. For the Hermitian case it calls syevr (i.e. dsyevr for real-symmetric and zheevr for complex-Hermitian), which guarantees that the eigenvectors are orthonormal (up to floating-point accuracy).
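
For example, a quick check (a minimal sketch; the random A below is just for illustration):

using LinearAlgebra

B = randn(ComplexF64, 4, 4)
A = Hermitian(B + B')    # exactly Hermitian by construction

vals, vecs = eigen(A)    # Hermitian input dispatches to LAPACK's syevr (zheevr here)
norm(vecs' * vecs - I)   # on the order of 1e-15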


Well, no. Only eigenvectors associated with different eigenvalues of a normal matrix are automatically orthogonal. Within the eigenspace of a repeated eigenvalue, they can be arbitrary. As far as I know, eigen checks whether the matrix is exactly Hermitian and dispatches to the appropriate routine. You can bypass the check by wrapping the matrix in Hermitian.
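
For instance, a small sketch of that dispatch check (as far as I can tell, ishermitian is an exact element-wise comparison):

using LinearAlgebra

A = [1.0 2.0; 2.0 1.0]
ishermitian(A)            # true: eigen(A) will use the Hermitian solver

A[1, 2] = nextfloat(2.0)  # perturb a single entry by one ulp
ishermitian(A)            # false: eigen(A) falls back to the general solver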


To clarify, (the basis of) eigenvectors of normal matrices (e.g. Hermitian) can always be chosen orthonormal even for repeated (multiplicity > 1) eigenvalues, and LAPACK guarantees an orthonormal choice for its Hermitian-eigensolver routines.


Right, I should have specified.

Btw, I’m not sure whether it actually happens that ggev gives (very) non-orthonormal eigenvectors for (close-to-)Hermitian matrices, or whether the non-Hermitian solver tries to keep the eigenvectors as orthogonal as it can anyway, for stability.

Since the algorithm starts by computing the Schur form, I think it follows that the eigenvectors should stay close to orthonormal for nearly Hermitian matrices (where the Schur form is nearly diagonal), even for repeated eigenvalues?

Turns out no: arbitrarily small perturbations of matrices with degenerate eigenvalues have non-orthogonal eigenvectors:

julia> A = eigen(I+1e-14*randn(2,2)).vectors; norm(A'A-I)
1.2919032002219082

That’s just because those eigenvectors are the eigenvectors of randn(2,2), which have no reason to be orthonormal. For orthogonality to hold, I think you need the perturbation to be much smaller than the eigenvalue gap. In practice, however, this kind of trouble probably doesn’t happen very often: to get repeated eigenvalues of a Hermitian matrix you usually need some symmetry, so the only way you can run into trouble is if that symmetry is not reflected at the floating-point level. Anyway, the short answer is: wrap in Hermitian.
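
For comparison, here is the same experiment with the wrapper (a small sketch; note that Hermitian reads only the upper triangle of its argument):

using LinearAlgebra

B = I + 1e-14*randn(2, 2)
V = eigen(Hermitian(B)).vectors
norm(V'V - I)                    # on the order of machine epsilon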


I had the same problem of diagonalizing a matrix with many zero eigenvalues, which is the story behind this question. So, to cope with realistic numerical error, it seems that I just need to wrap the matrix in Symmetric() for the real case or Hermitian() for the complex case to make sure the eigenvectors are orthonormal.

Let me see if I have everything correct. To guard against numerical error in a real symmetric or complex Hermitian matrix and make sure the eigenvectors are orthonormal, the options are

  • eigen(Symmetric(A)) or eigen(Hermitian(A)) for the real case
  • eigen(Hermitian(A)) for the complex case

If S is a sparse matrix, to achieve the same goal one just needs to use Matrix to convert the sparse matrix to a dense one, wrap it with the corresponding Symmetric/Hermitian wrapper, and then call eigen().
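
Something like this sketch, which is only sensible when S is small enough to densify:

using LinearAlgebra, SparseArrays

S = sprandn(200, 200, 0.05)
S = (S + S') / 2                          # symmetrize exactly
vals, vecs = eigen(Symmetric(Matrix(S)))  # densify, wrap, then solve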

If S is sparse, you probably don’t want to be using eigen, since it will be really inefficient. You can take Symmetric of a sparse matrix without problems, but eigen of that doesn’t work, because it would be really inefficient to do so.

To get the eigensystem of a sparse matrix S, besides using eigen(Symmetric(Matrix(S))), do you have a better (more efficient) solution? Apparently eigen(Symmetric(S)) does not work, since there is no matching method.

I think you will want to use Arpack.jl for this. I’m not 100% sure why we don’t have these methods in Base.
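
A minimal sketch, assuming Arpack.jl is installed (eigs returns the eigenvalues and eigenvectors as the first two elements of a tuple):

using Arpack, LinearAlgebra, SparseArrays

S = sprandn(1000, 1000, 0.005)
S = (S + S') / 2               # sparse and exactly symmetric
vals, vecs = eigs(S; nev = 6)  # by default the 6 largest-magnitude eigenvalues
norm(vecs' * vecs - I)         # the returned Ritz vectors should be orthonormal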

ARPACK uses an iterative method to perform sparse diagonalization and is suitable for computing only a small subset of the eigensystem. The performance of ARPACK as of now seems not so good: it consumes way too much memory and is also very slow compared to eigs in MATLAB.

From what I can tell, eigs in MATLAB uses ARPACK as well, and it also only gives you the largest eigenvalues.

For old versions of MATLAB, yes. But later versions of MATLAB adopted a new method, as far as I know.