Calculating the `k` smallest positive eigenvalues of a matrix with ARPACK

Please consider writing a blog post about Hessenberg factorizations: what they are useful for, and how to use them in Julia. Your PR adds a very important capability to LinearAlgebra, but I think Hessenberg is one of the lesser-known factorizations.

Also, if it is symmetric and of size 10000, you should also be able to get away with computing all eigenvalues. For symmetric matrices, the dense algorithms can be pretty fast, and it is possible to compute only a range of the eigenvalues. I have little to no experience with this myself, but I know these methods exist in LAPACK and are exposed in Julia.
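To make "compute only a range of eigenvalues" concrete, here is a minimal sketch using SciPy (chosen because the later posts in this thread work in Python; the same LAPACK subset drivers back Julia's `eigen(Symmetric(A), 1:k)`). Note `subset_by_index` requires SciPy >= 1.5:

```python
# Sketch: ask LAPACK for only the k smallest eigenvalues of a dense
# symmetric matrix, instead of the full spectrum.
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
B = rng.standard_normal((100, 100))
A = (B + B.T) / 2          # dense symmetric test matrix

k = 5
# Only eigenvalues with indices 0..k-1 (0-based, inclusive, ascending).
smallest = eigh(A, eigvals_only=True, subset_by_index=[0, k - 1])

# Full spectrum, for comparison.
full = eigh(A, eigvals_only=True)
assert np.allclose(smallest, full[:k])
```

For a 10000×10000 symmetric matrix this is often entirely practical, and avoids the convergence issues of iterative `SM` computations.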

Also, if you do want to continue with shift-and-invert: for a symmetric matrix, the Hessenberg factorization suggested by @stevengj amounts to bringing it into SymTridiagonal form, which is also the first step in computing the dense eigenvalue decomposition. I am not sure whether that step is explicitly exposed in Julia; I guess @stevengj knows better?
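A sketch of that reduction, again in SciPy for consistency with the rest of the thread: `scipy.linalg.hessenberg` uses the general (`gehrd`) driver rather than the symmetric one, but applied to a symmetric matrix it still returns a numerically tridiagonal `H`, whose diagonals can be handed to the tridiagonal eigensolver that finishes the dense symmetric algorithm:

```python
# Sketch: symmetric matrix -> tridiagonal form -> eigenvalues.
import numpy as np
from scipy.linalg import hessenberg, eigh_tridiagonal

rng = np.random.default_rng(1)
B = rng.standard_normal((50, 50))
A = (B + B.T) / 2                      # symmetric input

H, Q = hessenberg(A, calc_q=True)      # for symmetric A, H is tridiagonal
d = np.diag(H)                         # main diagonal
e = np.diag(H, -1)                     # subdiagonal
# Entries outside the tridiagonal band are only rounding noise:
assert np.allclose(H, np.diag(d) + np.diag(e, -1) + np.diag(e, 1))

# eigh_tridiagonal is the second stage of the dense symmetric solver.
vals = eigh_tridiagonal(d, e, eigvals_only=True)
assert np.allclose(vals, np.linalg.eigvalsh(A))
```

In Julia 1.3+, `hessenberg(Hermitian(A))` performs the symmetric reduction directly, as shown in the next reply.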


Yes: in Julia 1.3, if you do `hessenberg(Hermitian(A))`, it will form a real `SymTridiagonal` Hessenberg factor explicitly:

julia> hessenberg(Hermitian(rand(5,5)))
Hessenberg{Float64,SymTridiagonal{Float64,Array{Float64,1}},Array{Float64,2},Array{Float64,1},Bool}
Q factor:
5×5 LinearAlgebra.HessenbergQ{Float64,Array{Float64,2},Array{Float64,1},true}:
  0.527746    0.453729  -0.641028   -0.323568  0.0
  0.425025    0.126814   0.713474   -0.542427  0.0
  0.0689248  -0.820645  -0.27317    -0.497162  0.0
 -0.73218     0.323405  -0.0735948  -0.5949    0.0
  0.0         0.0        0.0         0.0       1.0
H factor:
5×5 SymTridiagonal{Float64,Array{Float64,1}}:
 -0.34423   -0.361886    ⋅          ⋅          ⋅
 -0.361886  -0.654238  -0.29319     ⋅          ⋅
   ⋅        -0.29319    0.460663   0.698886    ⋅
   ⋅          ⋅         0.698886   2.33577   -1.54195
   ⋅          ⋅          ⋅        -1.54195    0.361714

julia> hessenberg(Hermitian(rand(ComplexF64, 5,5)))
Hessenberg{Complex{Float64},SymTridiagonal{Float64,Array{Float64,1}},Array{Complex{Float64},2},Array{Complex{Float64},1},Bool}
Q factor:
5×5 LinearAlgebra.HessenbergQ{Complex{Float64},Array{Complex{Float64},2},Array{Complex{Float64},1},true}:
  0.136178-0.149696im  …  -0.455647-0.203191im  0.0+0.0im
 -0.530702+0.110264im     -0.285712-0.428174im  0.0+0.0im
 0.0588643-0.625142im     -0.136137-0.367904im  0.0+0.0im
  0.453133+0.256218im     -0.439867-0.372513im  0.0+0.0im
       0.0+0.0im                0.0+0.0im       1.0+0.0im
H factor:
5×5 SymTridiagonal{Float64,Array{Float64,1}}:
 -0.434709  -0.181771    ⋅          ⋅         ⋅
 -0.181771   0.278323  -0.758955    ⋅         ⋅
   ⋅        -0.758955   1.01864    1.61074    ⋅
   ⋅          ⋅         1.61074    1.07538  -1.98639
   ⋅          ⋅          ⋅        -1.98639   0.958969

Thank you for your comments, everyone. Sorry for the delay, but I was on vacation.

Thanks for posting your data.
For what it's worth, the eigenvalues of M^-1 A all lie approximately on the imaginary axis, with |Re| < 2e-10;
the imaginary parts range over ±2.5e2 to 2.7e6, i.e. a condition number of about 1e4.
A stem-and-leaf plot of the exponents:

2 2
3 3 3 3
4 4 4 4 4 4 4 4 4 4 4 4 4 4 4
5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5
6 6 6 6 6 6 6 6 6 6 6 6 6 6 6 6 6 6 6 6 6 6 6 6

(I work in Python, not Julia, sorry. Running the ARPACK wrapper scipy.sparse.linalg.eigs on M^-1 A as a dense matrix,
both SM and shift-invert, i.e. LM with sigma=0, work with tol 1e-4;
with tol 1e-6, SM doesn't converge.)
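For reference, here is a minimal version of that comparison on a toy sparse matrix with a known spectrum (a hypothetical stand-in, since the poster's matrices are not available). With `sigma=0`, ARPACK runs in shift-invert mode: the wanted small eigenvalues of A become the largest eigenvalues of (A - sigma*I)^-1, which is why it converges where plain `which='SM'` struggles at tight tolerances:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigs

# Toy sparse matrix with known eigenvalues 1, 2, ..., 200.
n = 200
A = sp.diags(np.arange(1.0, n + 1)).tocsc()  # CSC so splu can factor it

# Shift-invert: sigma=0 makes eigs factor (A - 0*I) once and iterate
# with its inverse, targeting the eigenvalues closest to the shift.
vals_si, _ = eigs(A, k=3, sigma=0)

# The alternative, eigs(A, k=3, which='SM'), asks ARPACK for the
# smallest-magnitude eigenvalues directly, with the slow convergence
# noted above.

smallest = np.sort(vals_si.real)  # approximately [1, 2, 3]
```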

Now it's easy to generate test matrices with whatever spectrum you like – but what's reasonable?
A wiki for test cases / plots of real spectra would be welcome.

Note that discourse.julialang.org is really focused on using Julia; it is not a general help forum for numerical linear algebra. Also, you generally shouldn't resurrect old discussion threads to raise entirely new questions.
