[ANN] ArnoldiMethod.jl v0.4

Hi all, I’ve released ArnoldiMethod.jl v0.3.5 and v0.4.0 recently, after a relatively long break from writing Julia code.

See GitHub: JuliaLinearAlgebra/ArnoldiMethod.jl, the Arnoldi method with Krylov-Schur restart, natively in Julia.

ArnoldiMethod.jl can be used to compute eigenvalues and eigenvectors (or Schur vectors) of large and sparse matrices. If you’ve used MATLAB, you may know the eigs function, which does something similar.

The package implements the Krylov-Schur algorithm, which is a slight improvement over the Implicitly Restarted Arnoldi Method you may know from ARPACK and MATLAB’s eigs function.
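As a quick illustration, here is a minimal sketch of the two-step workflow on a small sparse test matrix. The `nev`/`which`/`tol` keywords and the `partialschur`/`partialeigen` pair follow the package README; note that passing `which` as a symbol (e.g. `:LR`) is the v0.4 style, so check the docs for the exact signatures in your version:

```julia
using ArnoldiMethod, LinearAlgebra, SparseArrays

# A sparse diagonal test matrix whose eigenvalues are simply 1, 2, ..., 100.
A = spdiagm(0 => 1.0:100.0)

# Step 1: compute a partial Schur decomposition, targeting the four
# eigenvalues of largest real part.
decomp, history = partialschur(A; nev = 4, tol = 1e-10, which = :LR)

# Step 2: transform the Schur vectors into eigenvalues and eigenvectors.
vals, vecs = partialeigen(decomp)
```

For this matrix the four returned eigenvalues should be close to 97, 98, 99, and 100.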

The ArnoldiMethod.jl package should be much more stable than Arpack.jl, and slightly more stable than KrylovKit.jl’s eigsolve / schursolve functions. It should also have better performance than KrylovKit.jl (mostly because it pre-allocates and works in-place as much as possible). If not, open an issue.

Finally, ArnoldiMethod.jl is, I believe, the only implementation of a partial Schur decomposition where everything is implemented natively in Julia, meaning that it works on arbitrary number types (I’ve only tested DoubleFloats.jl and BigFloat).

There are no algorithmic differences between v0.4 and v0.3. It’s just that v0.4 exports fewer names.

What’s new in both v0.3 and v0.4:

  • You can provide an initial guess via partialschur(A; v1=...), where v1 is the vector that induces the Krylov subspace.
  • You can pre-allocate large arrays and temporaries using the ArnoldiWorkspace struct, and run the algorithm in-place using partialschur!
  • You can reuse an existing partial Schur decomposition in an ArnoldiWorkspace and continue to compute additional eigenvalues and Schur vectors.
  • Stability of restart is improved: converged eigenvectors with eigenvalues far from the target are properly purged (removed from the Krylov subspace), so it is unlikely that the algorithm has to do computations on an array containing both very large and very tiny numbers, which otherwise leads to loss of precision. Also, in one crucial step where large and tiny numbers are expected, Givens rotations are now used instead of Householder reflections.
  • Fixed a few bugs, mostly related to locking of converged vectors.
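A sketch of the first of these features, the v1 keyword from the release notes (the surrounding nev/which keywords are as in the README; whether partialschur normalizes v1 itself is not something I rely on here, so the sketch normalizes it explicitly):

```julia
using ArnoldiMethod, LinearAlgebra, SparseArrays

A = spdiagm(0 => 1.0:100.0)

# Seed the Krylov subspace with an explicit starting vector via the
# v1 keyword, e.g. to warm-start from an earlier solve's eigenvector.
v1 = normalize!(rand(100))

decomp, history = partialschur(A; nev = 4, which = :LR, v1 = v1)
vals, vecs = partialeigen(decomp)
```

A good v1 (one with a large component in the direction of the wanted eigenvectors) typically reduces the number of restarts needed.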

What’s still not in v0.3 and v0.4 after all these years:

A utility function like eigs that directly returns eigenvalues and eigenvectors. Right now ArnoldiMethod.jl is still a two-step method: first compute a partial Schur decomposition, then transform it into an eigendecomposition. I’m open to PRs.
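Such a convenience wrapper could look roughly like this. The function name eigs_like and its defaults are hypothetical, not part of the package; it just chains the two documented steps:

```julia
using ArnoldiMethod, LinearAlgebra, SparseArrays

# Hypothetical eigs-like helper (not in ArnoldiMethod.jl): run the
# partial Schur decomposition, then convert it to an eigendecomposition,
# returning eigenvalues and eigenvectors directly.
function eigs_like(A; nev = 6, which = :LM, kwargs...)
    decomp, _ = partialschur(A; nev = nev, which = which, kwargs...)
    return partialeigen(decomp)
end

# Usage on a small sparse test matrix with eigenvalues 1, 2, ..., 50:
vals, vecs = eigs_like(spdiagm(0 => 1.0:50.0); nev = 3, which = :LR)
```

A real version would presumably also surface the convergence history and forward all remaining keywords, which is why a PR is more appropriate than this ten-line sketch.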

Give it a try!


Concerning the desirable extensions: for the “vibration” problem (a symmetric generalized eigenvalue problem, looking for the smallest eigenvalues), the package PetrKryslUCSD/VibrationGEPHelpers.jl (Generalized Eigenvalue Problem helper functions) wraps ArnoldiMethod.jl (and other similar packages), and so provides a nearly uniform interface similar to eigs.