For some numerical simulations I run, I need to construct a matrix and obtain its leading eigenvalue, and I need both of them later in the computation.
I was using the `LinearAlgebra` standard library and computing the leading eigenvalue as `maximum(real(eigvals(matrix)))`. When profiling the code, I noticed that this computation takes a noticeable amount of time, especially for larger matrices, so I am trying to speed it up.
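For reference, a minimal version of what I am doing now looks like this (the matrix here is just a random placeholder; my real matrices are dense and considerably larger):

```julia
using LinearAlgebra

# Placeholder matrix; my real matrices are dense and much larger.
A = randn(1000, 1000)

# Compute the full spectrum and take the eigenvalue with the largest real part.
# eigvals works on a copy, so A is left intact for later use.
λ_max = maximum(real(eigvals(A)))
```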
I cannot use `eigvals!` because I need the matrix later and do not want to overwrite it. Furthermore, I only need one eigenvalue. I found that `Arpack.jl` can compute the leading eigenvalue directly. It is way faster, but it fails to diagonalize some matrices, with the error
```
┌ Error: XYAUPD_Exception: Maximum number of iterations taken. All possible eigenvalues of OP has been found.
│ IPARAM(5) returns the number of wanted converged Ritz values.
```
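The kind of call I am making with `Arpack.jl` is roughly the following (a sketch; the exact keyword values here are illustrative rather than my production settings):

```julia
using Arpack

A = randn(1000, 1000)  # placeholder dense matrix

# Ask ARPACK for a single eigenvalue, the one with the largest real part.
# eigs returns a tuple (values, vectors, nconv, niter, nmult, resid);
# only the first two entries are used here.
vals, vecs = eigs(A; nev=1, which=:LR)
λ_max = real(vals[1])
```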
Looking at the package's issues, it seems that version 0.5.4 of the package is broken and leads to this problem; other users have reported it, see 1, 2. It may be that some matrices simply cannot be handled by this method (the ones I am using are dense and large), but the error message is not clear about that, and the package does not seem to be maintained anymore. Other alternatives such as `KrylovKit.jl` or `Arnoldi.jl` also have no recent commits and no responses to open issues, so they all seem to be unmaintained.
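If I understand its API correctly, the equivalent call with `KrylovKit.jl` would look roughly like this (a sketch, not something I have validated on my actual matrices):

```julia
using KrylovKit

A = randn(1000, 1000)  # placeholder dense matrix

# Request a single eigenvalue with the largest real part; eigsolve picks
# a random starting vector when only the matrix is passed.
vals, vecs, info = eigsolve(A, 1, :LR)
λ_max = real(vals[1])
```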
So I wonder what the supported, most efficient way of computing just a single eigenvalue is, and whether there is any plan to take up maintenance of the `Arpack.jl` package again. The speed difference compared with computing the full spectrum is considerable, especially for large matrices.
Thank you for your help!