I am trying to compute many interior eigenvalues of large sparse banded complex Hermitian matrices.

By the way, the (absolute value of the) matrix looks like this for the ~1500 by ~1500 case:

Here, ~98.8% of the elements are zero.

So far the fastest approach is simply converting to a dense representation and using the LAPACK routine underlying the `eigvals` function. (What is more, it seems to be faster to compute *all* the eigenvalues than only ~20% of them? I didn't expect that.)
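For reference, here is a minimal sketch of the dense full-vs-subset comparison on the SciPy side (which I am already calling through PyCall anyway); `scipy.linalg.eigh` exposes the LAPACK subset machinery through `subset_by_index`. The random Hermitian banded matrix is just a stand-in for mine:

```python
import numpy as np
from scipy.linalg import eigh

# Made-up Hermitian banded matrix (bandwidth 3) as a stand-in
rng = np.random.default_rng(0)
n = 200
A = np.zeros((n, n), dtype=complex)
for k in range(1, 4):
    A += np.diag(rng.standard_normal(n - k) + 1j * rng.standard_normal(n - k), k)
A = A + A.conj().T                      # Hermitian off-diagonal part
A += np.diag(rng.standard_normal(n))    # real diagonal

w_all = eigh(A, eigvals_only=True)      # all n eigenvalues
w_sub = eigh(A, eigvals_only=True,
             subset_by_index=[0, n // 5 - 1])  # lowest ~20%
```

Timing these two calls against each other is how I noticed the subset version being slower for me.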

I have tried Arpack.jl (with shift-and-invert) and BandedMatrices.jl coupled with PyCall + `scipy.linalg.eig_banded`, but these attempts are at least 10 times slower. I briefly looked at packages like KrylovKit.jl, but all the sparse eigensolvers in Julia that I have come across besides Arpack.jl appear to be suited only to extremal eigenvalues.
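For concreteness, the shift-and-invert computation I tried (via Arpack.jl) corresponds roughly to the following on the SciPy side, using `scipy.sparse.linalg.eigsh` with `sigma`; the tridiagonal test matrix here is made up and not my actual operator:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

# Hypothetical sparse Hermitian tridiagonal matrix as a stand-in
n = 400
main = np.arange(n, dtype=float)
off = np.full(n - 1, 1.0 + 1.0j)
A = sp.diags([off.conj(), main, off], [-1, 0, 1], format="csc")

# Shift-and-invert: the k eigenvalues nearest sigma (interior)
sigma = 100.0
w = eigsh(A, k=6, sigma=sigma, return_eigenvectors=False)
```

The cost here is dominated by the sparse factorization of `A - sigma*I` plus one solve per iteration, which is presumably why it loses to dense LAPACK at these sizes.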

I saw the FILTLAN package which appears to be made for (something close to?) the kind of thing I am trying to do, but it has no documentation and is written in C++ and hence seems like it would be difficult to interface with my other code.

So the question: any advice on where to look for the best user-friendly eigensolver for this purpose? For now I am going with the dense conversion, but a) if there is something faster for my particular kind of matrices that takes better advantage of their structure, that would be great, and b) I am not sure the dense route will remain an option at larger sizes in the future.

For now I am just computing eigenvalues. Also, in case it matters, the matrices are not only banded but are constructed from sums of Kronecker products of Hermitian tridiagonal matrices.
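In the special commuting case A = B ⊗ I + I ⊗ C, the spectrum would just be all pairwise sums of the factors' eigenvalues, so no iterative solver would be needed at all; I don't think my sums reduce to that in general, but the Kronecker structure at least keeps assembly sparse. A Python/SciPy sketch of that special case (the tridiagonal factors here are made up):

```python
import numpy as np
import scipy.sparse as sp

def herm_tridiag(k, seed):
    # Random Hermitian tridiagonal factor (stand-in for the real ones)
    rng = np.random.default_rng(seed)
    d = rng.standard_normal(k)                                    # real diagonal
    e = rng.standard_normal(k - 1) + 1j * rng.standard_normal(k - 1)
    return sp.diags([e.conj(), d, e], [-1, 0, 1])

m, n = 12, 15
B, C = herm_tridiag(m, 1), herm_tridiag(n, 2)

# A = B x I + I x C assembled sparsely; the factors are never densified
A = sp.kron(B, sp.identity(n)) + sp.kron(sp.identity(m), C)
```

Because B ⊗ I and I ⊗ C commute, every eigenvalue of A is λ_i(B) + μ_j(C), which is cheap to check against a dense solve at small sizes.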