I am pretty new to Julia, and I would appreciate your help in the following.

I have a large, non-symmetric, sparse complex matrix (800,000×800,000 with ~10M nonzeros) stored in a file in COO format, i.e., {row, col, real(a(row,col)), imag(a(row,col))}. I want to test Julia’s sparse eigenvalue solvers, so I would appreciate your help on how to load this file and pass it to an eigensolver in Julia. (BTW, I only want to compute ~100-200 eigenvalues, and any suggestion on a suitable choice of eigensolver would be much appreciated too!)

Thank you in advance for your consideration, and apologies if the above is a naive question to the community!

The easiest way to load this data into a sparse matrix representation in Julia is probably via two steps:

Step 1: load the values into memory as they are (i.e. in COO format). This step depends on the file format (where by format I mean the kind of file, e.g. .csv, .mat, etc.). To load a CSV file, use CSV.jl. This should give you access to row, col, r, i (just my naming for the four columns). You then probably want to turn the two vectors holding the real and imaginary parts into a single vector of complex numbers. You can do that like: val = r + im * i
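A minimal sketch of this step, assuming the file is comma-separated with no header row (the filename and column names are just placeholders):

```julia
using CSV, DataFrames

# Read the four COO columns; `header` assigns our own names since the
# file is assumed to have no header row.
df = CSV.read("matrix.coo", DataFrame;
              header = ["row", "col", "re", "im"])

row = df.row
col = df.col
# Combine real and imaginary parts into a single ComplexF64 vector.
val = df.re .+ im .* df.im
```

If the file is whitespace-delimited instead, pass `delim=' ', ignorerepeated=true` to `CSV.read`.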

Step 2: Julia has support for sparse arrays in its standard library. It stores the data in CSC format, but it’s very easy to create a sparse matrix from COO format. The details are here: Sparse Arrays · The Julia Language. Given the vectors above, it is simply m = sparse(row, col, val) (make sure to load the SparseArrays standard library via using SparseArrays). m is then a sparse matrix object (stored as CSC).
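Concretely, and assuming the file uses 1-based indices (if it is 0-based, add 1 to row and col first):

```julia
using SparseArrays

# sparse(I, J, V, m, n) builds a CSC matrix from COO triplets.
# Passing the dimensions explicitly guards against trailing empty
# rows/columns being silently dropped.
A = sparse(row, col, val, 800_000, 800_000)
```

Note that `sparse` sums duplicate (row, col) entries by default, which is usually what you want for COO input.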

From there you just pass that matrix object to an eigensolver that supports sparse matrices. Unfortunately the LinearAlgebra standard library and its eigen() function do not support sparse matrices. But the Arpack.jl package and its eigs() function do. I’m not sure whether there are better/more suitable implementations.
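A sketch of the eigs() call for this case (nev and which are real eigs keyword arguments; the values shown are just for illustration):

```julia
using Arpack

# nev: how many eigenvalues to compute.
# which = :SM asks for the eigenvalues of smallest magnitude; note that
# :SM can converge slowly, and shift-invert (sigma = 0) is often faster.
λ, ϕ = eigs(A; nev = 100, which = :SM)
```

eigs actually returns a longer tuple (including convergence info); the destructuring above keeps only the eigenvalues and eigenvectors.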

ARPACK is the “classic” 1990s Fortran library for the (implicitly restarted) Arnoldi eigenvalue algorithm, and Arpack.jl gives a Julia interface to it. These days, however, it may be superseded by pure-Julia (and hence more flexible, though not necessarily faster) packages like ArnoldiMethod.jl and KrylovKit.jl. YMMV.
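For comparison, a KrylovKit.jl sketch might look like this (eigsolve takes an explicit starting vector; :LM selects largest-magnitude eigenvalues, since Krylov methods converge best for exterior eigenvalues, while interior ones generally need a shift-invert wrapper):

```julia
using KrylovKit

n = size(A, 1)
# eigsolve(A, x₀, howmany, which): x₀ is a random complex starting
# vector matching A's element type.
vals, vecs, info = eigsolve(A, rand(ComplexF64, n), 100, :LM)
```

The returned `info` reports how many of the requested eigenvalues actually converged, which is worth checking for a problem this size.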

IterativeSolvers.jl also has the LOBPCG algorithm (for Hermitian matrices, not @Eustace’s case). One algorithm/library we don’t currently seem to have is FEAST — it would be nice to have Julia bindings for this. (There are a couple of experimental Julia-native FEAST-like packages, but they don’t seem very mature yet, and FEAST is an extremely tricky algorithm to implement well.)

I’m also noticing that people are starting to do research on new eigensolver algorithms using Julia, e.g. here and here, so hopefully we will get even more good stuff in the future.

Thank you all for your input, I appreciate it a lot! I will give it a try now.

As a side note, I’m very familiar with FEAST (v4 in particular uses inexact solvers, which I think are especially suitable for very large problems), and it would indeed be nice to have Julia bindings for it!

Prof. Krysl, I want to compute, e.g., 100-200 eigenvalues with the smallest magnitude, or 100-200 eigenvalues close to a given shift.
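For the shift case, eigs() supports shift-invert via its sigma keyword; a hedged sketch (σ is a hypothetical shift, to be replaced with your target):

```julia
using Arpack

σ = 0.0 + 0.0im   # hypothetical shift; replace with your target value
# With sigma set, eigs works in shift-invert mode and returns the
# eigenvalues of A closest to σ (which = :LM refers to the
# transformed operator (A - σI)⁻¹).
λ, ϕ = eigs(A; nev = 200, sigma = σ, which = :LM)
```

Shift-invert requires a sparse factorization of A - σI internally, so memory use for an 800,000×800,000 matrix is worth keeping an eye on.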