Hi there!
I’m trying to solve an eigenvalue problem. Originally I have a 792x792 matrix that I turned into a SparseArray, and for that I’m using Arpack and eigs(). The dimension of the matrix is 792, so I expect that many eigenvalues, but I only get 791. I’m specifying the number of eigenvalues (nev = 792), but I get this warning:
┌ Warning: Adjusting nev from 792 to 791
└ @ Arpack ~/.julia/packages/Arpack/UiiMc/src/Arpack.jl:99
I tried the same matrix in Mathematica and got all 792 eigenvalues without any problem.
My actual code is rather heavy, so I made a minimal example with a 3x3 matrix, and I get the same problem:
using Arpack
foo = [[1, 2, 3] [5, 6, 7] [8, 9, 10]] # dummy 3x3 matrix built from three column vectors
eigs(foo, nev = 3)[1] # I tried this one in Mathematica too and got all 3 eigenvalues
But the same warning appears again!
┌ Warning: Adjusting nev from 3 to 1
└ @ Arpack ~/.julia/packages/Arpack/UiiMc/src/Arpack.jl:99
Does anyone know why I’m getting fewer eigenvalues than I expect?
Is there any way to avoid that automatic adjustment of nev?
By the way, I’m using version 1.1.0 of Julia on my Mac.
Yes, I totally agree with you. But 792 is just one case; my code should be able to handle matrices of much larger dimension than that, and in some cases I will need sparse arrays to deal with a lot of zeros. That’s my point.
For that reason I’m looking for a way to get all the eigenvalues of a sparse matrix.
AFAIK, ARPACK simply isn’t made for what you’re asking. It is a tool for calculating a few extremal eigenvalues of huge sparse matrices, not for calculating all of them. Getting all the eigenvalues of a large sparse matrix is an expensive problem!
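To illustrate the kind of problem ARPACK is meant for, here is a minimal sketch; the random test matrix and the choices nev = 6 and which = :LM are just placeholders:
using SparseArrays, Arpack
A = sprand(792, 792, 0.01) # hypothetical sparse test matrix, roughly 1% nonzeros
A = A + A' # symmetrize so ARPACK converges quickly
vals, vecs = eigs(A, nev = 6, which = :LM) # a few eigenvalues of largest magnitude: cheap and fast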
Anyway, there are iterative solvers implemented in Julia that give you what you want, for example KrylovKit.jl.
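A minimal sketch of that approach, reusing the 3x3 foo from above converted to floats (asking for 3 eigenvalues of largest magnitude is an assumption here):
using KrylovKit
vals, vecs, info = eigsolve(float.(foo), 3, :LM) # returns eigenvalues, eigenvectors, and convergence info
vals # all 3 eigenvalues come back, with no nev adjustment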