I am having trouble computing eigenvectors and bi-orthonormalizing them.
I am trying to solve the generalized eigenvalue problem associated with matrices A and B.
A is not symmetric; B is symmetric but not necessarily positive definite.
Since A is not symmetric, one has to compute both right and left eigenvectors, X and Y respectively.
After some attempts, I managed to compute them (eigs from Arpack.jl was a failure; I ended up using ArnoldiMethod.jl with the shift-and-invert method; KrylovKit.jl does not apply since A is neither symmetric nor Hermitian).
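For concreteness, here is roughly the shift-and-invert setup I am assuming, on a small dense stand-in for my actual matrices (the `ShiftAndInvert` wrapper is my own, following the pattern in the ArnoldiMethod.jl docs; `partialschur` only needs `mul!`, `size`, and `eltype`):

```julia
using ArnoldiMethod, LinearAlgebra, Random
Random.seed!(0)

# Operator v ↦ (A - σB)⁻¹ B v; eigenvalues of this operator are μ = 1/(λ - σ).
struct ShiftAndInvert{F,M}
    fact::F   # factorization of A - σB, computed once
    B::M
end
Base.size(s::ShiftAndInvert, i...) = size(s.B, i...)
Base.eltype(s::ShiftAndInvert) = eltype(s.B)
LinearAlgebra.mul!(y, s::ShiftAndInvert, x) = ldiv!(y, s.fact, s.B * x)

n = 50
A = randn(n, n)                      # non-symmetric
B = Matrix(Symmetric(randn(n, n)))   # symmetric, possibly indefinite
σ = 0.0                              # shift near the eigenvalues of interest

# Right eigenvectors: largest-magnitude μ correspond to λ nearest σ.
op = ShiftAndInvert(lu(A - σ * B), B)
decomp, _ = partialschur(op; nev = 10, tol = 1e-10, which = :LM)
μ, X = partialeigen(decomp)
λ = σ .+ inv.(μ)                     # undo the spectral transformation

# Left eigenvectors: right eigenvectors of the adjoint pencil (A', B').
opH = ShiftAndInvert(lu(copy((A - σ * B)')), copy(B'))
decompH, _ = partialschur(opH; nev = 10, tol = 1e-10, which = :LM)
μH, Y = partialeigen(decompH)
```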
So I now have X and Y, and I want to bi-orthonormalize them, i.e., Y^T * B * X = Id (where ^T denotes the conjugate/Hermitian transpose) and Y^T * A * X = L, where L is the diagonal matrix of eigenvalues.
I do this by computing Y^T * B * X = D and then setting, for example, X <- X * inv(D), which indeed gives Y^T * B * X = Id.
But Y^T * A * X comes out badly: the eigenvalues are on the diagonal, but the off-diagonal terms become very large (about 1e-2 in norm) for the last eigenvalues (I compute 10 of them).
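This is the scheme in a stdlib-only sketch, again on a small dense stand-in (here the left/right pairs are matched by nearest eigenvalue, which my notation above glosses over — a mismatched or ill-conditioned D would make Y^T * A * X non-diagonal even though Y^T * B * X = Id by construction):

```julia
using LinearAlgebra, Random
Random.seed!(0)  # reproducible hypothetical example

n = 6
A = randn(n, n)                      # non-symmetric
B = Matrix(Symmetric(randn(n, n)))   # symmetric, indefinite in general

# Right eigenvectors of the pencil (A, B); left ones from the adjoint pencil (A', B').
λ, X = eigen(A, B)
λl, Y = eigen(copy(A'), copy(B'))

# Pair columns: left eigenvalues are conjugates of the right ones,
# so match each λ[i] to the nearest conj(λl[j]).
perm = [argmin(abs.(conj.(λl) .- λ[i])) for i in eachindex(λ)]
Y = Y[:, perm]

# Bi-orthonormalize as described: D = Y' B X, then X ← X * inv(D).
D = Y' * B * X
X = X / D

errB = norm(Y' * B * X - I)              # ≈ 0 by construction
errA = norm(Y' * A * X - Diagonal(λ))    # small only if D is (nearly) diagonal
```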
So here are my questions:
- Is there another way to compute the eigenvalues/eigenvectors efficiently?
- Why does the bi-orthonormalization work so badly? It works well in MATLAB.
For reference, this is the error I got from eigs:

```
ERROR: LoadError: ArgumentError: Only symmetric or hermitian generalized eigenvalue problems with positive definite `B` matrix are currently supported.
```
Your scheme works for me with results from ArnoldiMethod.jl on benign matrices, but your remarks suggest that your case is not so easy. Have you used a small tolerance and checked that the residuals actually satisfy it? Experimenting with different shifts may also help.
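Something along these lines (a stdlib-only sketch; a small dense stand-in here, but in practice λ and X would come from your Arnoldi run):

```julia
using LinearAlgebra, Random
Random.seed!(1)

# Small dense stand-in for the pencil.
n = 6
A = randn(n, n)
B = Matrix(Symmetric(randn(n, n)))
λ, X = eigen(A, B)

# Relative residual of each right eigenpair: ‖A x − λ B x‖ / ‖x‖.
res = [norm(A * X[:, i] - λ[i] * B * X[:, i]) / norm(X[:, i]) for i in eachindex(λ)]
maximum(res)
```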
Maybe the eigenvalues at the edges of the window are badly converged? (I have no experience with ArnoldiMethod.jl, but KrylovKit.jl does ensure convergence of the requested number of eigenvectors, IIRC.)
A quick check is to request a few more vectors and compare the resulting matrices.