Eigen produces non-normalized eigenvectors

I realized that, when using LinearAlgebra.jl’s eigen function to solve a generalized eigenvalue problem for a matrix pencil (A, B), the eigenvectors returned do not seem to be normalized in any way. The standard in the numerical linear algebra community is usually to have them B-orthonormal, i.e., vecs[:, i]' * B * vecs[:, j] = \delta_{ij} where vals, vecs = eigen(A, B). I believe I used to rely on this property in previous Julia codes I developed. So I wonder: has the convention of the eigen implementation changed, and if so, to what, and why?

Edit: I realize that the issue also affects standard eigenvalue problems, for which the eigenvectors returned are likewise non-normalized. I am convinced those vectors used to be normalized and that the convention of the implementation has changed.
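In the meantime, if unit-norm eigenvectors are needed regardless of what eigen returns, the columns can always be rescaled after the fact. A minimal sketch (using only standard LinearAlgebra functions):

using LinearAlgebra

A = rand(4, 4); A = A + A'
vals, vecs = eigen(A)
# Rescale each column to unit Euclidean norm; eigenvectors are only
# defined up to scaling, so this preserves the eigenvector property.
vecs = mapslices(normalize, vecs, dims=1)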

Usually the eigenvectors are indeed normalized (at least for the regular eigenvalue problem). Can you give a simple example where this is not the case?

Recently, there was a longer thread where some numerical instability caused weird results in eigen. Maybe your issue is related?

julia> using Random, LinearAlgebra;

julia> Random.seed!(1);

julia> A = rand(4, 4); A = A + A'; rank(A)
4

julia> B = rand(4, 4); B = B + B'; rank(B)
4

julia> vals, vecs = eigen(A);

julia> [norm(A * vecs[:, k] - vals[k] * vecs[:, k]) / abs(vals[k]) for k in 1:4]
4-element Vector{Float64}:
 7.371902780897737e-15
 8.898479422775282e-15
 2.5101995328536974e-15
 4.4034817891396797e-16

julia> norm(vecs' * vecs - I)
4.3860879188693715

julia> vals, vecs = eigen(A, B);

julia> [norm(A * vecs[:, k] - vals[k] * B * vecs[:, k]) / abs(vals[k]) for k in 1:4]
4-element Vector{Float64}:
 4.572213844298374e-15
 5.133885946016943e-15
 1.4300429610170643e-15
 1.4300429610170643e-15

julia> norm(vecs' * B * vecs - I)
2.7484683851305776

Running your example, I get orthonormal eigenvectors from A, i.e., vecs' * vecs ≈ I.

Your expectation of vecs[:, i]' * B * vecs[:, j] = \delta_{ij} is only possible if B is positive definite. Changing your code to B = B * B', I get B-orthonormal eigenvectors from (A, B).

Weird. Running this, I get numerically orthonormal eigenvectors for the ordinary eigenvalue problem, so I can’t reproduce it. You can fail to get orthonormal eigenvectors for a symmetric problem if you ignore symmetry, do the nonsymmetric QR iteration, and then the standard eigenvector computation; that can happen for extremely tight clusters of eigenvalues. But I have looked into this before, and I’m pretty sure eigen looks for symmetry using a tolerance, even if you don’t wrap your matrix in Hermitian or Symmetric, and should do the right thing in this case. It would be interesting to see what you are getting for vecs and vecs' * vecs.
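If you want to take the symmetry detection out of the equation entirely, you can wrap the matrix yourself. A minimal sketch, reusing the A from your example:

using Random, LinearAlgebra

Random.seed!(1)
A = rand(4, 4); A = A + A'
# Wrapping in Symmetric dispatches to the dedicated symmetric solver,
# which always returns orthonormal eigenvectors.
vals, vecs = eigen(Symmetric(A))
norm(vecs' * vecs - I)  # should be on the order of machine epsilon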

I do get the same results that you did on the generalized eigenvalue problem. However, I believe there is a mathematical reason for that. The eigenvectors of a symmetric eigenvalue problem can be chosen to be orthogonal with respect to the inner product \langle x, y \rangle = y^T B x if B is positive definite. There is no reason to expect this if B is indefinite (and it’s not even an inner product in that case). If I change B = B + B' to B = B * B', I do get orthonormal eigenvectors with respect to that inner product.
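Concretely, a sketch of that change, with the same seed as the original example (B = B * B' is positive semidefinite by construction and almost surely positive definite for a random B):

using Random, LinearAlgebra

Random.seed!(1)
A = rand(4, 4); A = A + A'
B = rand(4, 4); B = B * B'  # positive definite (almost surely) instead of indefinite
vals, vecs = eigen(A, B)
# With symmetric A and positive definite B, the returned eigenvectors
# should now be B-orthonormal, so this residual should be tiny.
norm(vecs' * B * vecs - I)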

I restarted a REPL instance, and now everything works. I have no clue what happened. Anyway, thanks.
