Issue computing eigenvalues/vectors and orthonormalization

Hi there,

I am having trouble computing eigenvectors and orthonormalizing them.

I am trying to solve the generalized eigenvalue problem associated with matrices A and B.

A is not symmetric; B is symmetric but not necessarily positive definite.

Since A is not symmetric, one has to compute both right and left eigenvectors, X and Y respectively.

After some attempts, I managed to compute them (eigs from Arpack.jl failed; I ended up using ArnoldiMethod.jl with the shift-and-invert method; KrylovKit.jl does not apply since A is neither symmetric nor Hermitian).
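For reference, here is a minimal sketch of the kind of setup I ended up with (the shift σ, nev and tol are placeholders, and I assume real sparse A and B):

```julia
using ArnoldiMethod, LinearAlgebra, LinearMaps, SparseArrays

# Shift-and-invert: the largest-magnitude eigenvalues of inv(A - σ*B)*B
# correspond to the eigenvalues of A*x = λ*B*x closest to the shift σ.
σ = 0.0                               # placeholder shift
F = lu(A - σ * B)                     # factorize once, reuse in every matvec
op = LinearMap{Float64}(x -> F \ (B * x), size(A, 1))

decomp, history = partialschur(op; nev = 10, tol = 1e-10, which = LM())
μ, X = partialeigen(decomp)           # right eigenvectors of the shifted problem
λ = σ .+ inv.(μ)                      # map back to eigenvalues of A*x = λ*B*x

# The left eigenvectors Y come from the same procedure applied to the
# adjoint problem, i.e. x -> (A - σ*B)' \ (B' * x).
```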

So I now have X and Y, and I want to bi-orthonormalize them, i.e., Y^H * B * X = Id (where ^H is the Hermitian transpose) and Y^H * A * X = L, where L is the matrix with the eigenvalues on its diagonal.

I do this by computing D = Y^H * B * X and then setting, for example, X ← X * inv(D), which indeed gives Y^H * B * X = Id.
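In code, the normalization step looks roughly like this (a sketch; λ, X and Y are the eigenvalues and the right/left eigenvectors, one per column):

```julia
using LinearAlgebra

D = Y' * B * X            # Gram matrix; ' is the Hermitian transpose in Julia
X = X / D                 # equivalent to X * inv(D), but avoids forming inv(D)

@show norm(Y' * B * X - I)              # should be near machine precision
@show norm(Y' * A * X - Diagonal(λ))    # this is where the residue shows up
```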

But Y^H * A * X behaves badly. The eigenvalues are on the diagonal, but the off-diagonal terms become very large (around 1e-2 in norm) for the last eigenvalues (I compute 10 of them).

So here are my questions:

  • Is there another way to compute eigenvalues/eigenvectors efficiently?
  • Why is the orthonormalization working so badly? It works well in MATLAB.

KrylovKit.jl does not require this!

I get the following error:

ERROR: LoadError: ArgumentError: Only symmetric or hermitian generalized eigenvalue problems with positive definite `B` matrix are currently supported.

For the generalized eigenvalue problem, you need a Hermitian B.

The B matrix I use is Hermitian:

issymmetric(B) = true
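Note that symmetric alone is not enough for that solver, though: the error message also asks for a positive definite B, which you said yours need not be. A quick check (isposdef attempts a Cholesky factorization):

```julia
using LinearAlgebra

issymmetric(B)   # true, as you report
isposdef(B)      # if false, this solver cannot handle your problem
```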

Your scheme works for me with results from ArnoldiMethod.jl on benign matrices, but your remarks suggest that your case is not so easy. Have you used a small tolerance and checked that the residuals satisfy it? Experimenting with different shifts may also help.
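For example, a check along these lines (assuming λ and X hold your computed eigenvalues and right eigenvectors):

```julia
using LinearAlgebra

# Relative residual of each computed right eigenpair of A*x = λ*B*x;
# pairs whose residual is well above the requested tolerance are suspect.
for (i, λi) in enumerate(λ)
    x = X[:, i]
    println("pair $i: λ = $λi, residual = ", norm(A * x - λi * (B * x)) / norm(x))
end
```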

Maybe the eigenvalues at the edges of the window are badly converged? (I have no experience with ArnoldiMethod.jl, but KrylovKit.jl does ensure convergence of the requested number of eigenvectors, IIRC.)

A quick check is to simply request a few more vectors and compare the resulting matrices.
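A hypothetical sketch, reusing the op, σ and LM() names from the shift-and-invert sketch earlier in the thread:

```julia
# Solve twice with different subspace sizes; the 10 wanted eigenvalues
# should agree to roughly the tolerance between the two runs.
decomp10, _ = partialschur(op; nev = 10, tol = 1e-10, which = LM())
decomp15, _ = partialschur(op; nev = 15, tol = 1e-10, which = LM())
λ10 = σ .+ inv.(partialeigen(decomp10)[1])
λ15 = σ .+ inv.(partialeigen(decomp15)[1])
@show maximum(minimum(abs.(λ15 .- λ)) for λ in λ10)
```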

Out of curiosity: How large are your matrices?

You mean with symmetric B and non-symmetric A?

I did not try it on other matrices; I should give it a try.

Yes, I could compute more eigenvalues so that the ones I want are better converged.

My matrices are not so large, 30,000 × 30,000, and very sparse.

OK, it seems to work. Thanks!

I also tightened the tolerance a bit, which improved the results further.