Optimizing Sparse Matrix Exponential Computation in Julia

I’m trying to compute the sparse matrix exponential of large matrices (at least 10,000 x 10,000) using Julia packages like Expokit, FastExp, and ExpmV. However, I’m finding that the computation time is significantly longer compared to using Python’s scipy.sparse.linalg.expm_multiply.

For example, I’m using the function expmv(1.0, M, rho_vectorized), where M is a sparse matrix.
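A minimal sketch of that kind of call (assuming Expokit's expmv, which approximates exp(t*M)*v with a Krylov method; the size and density below are placeholders, not my real problem):

```julia
using SparseArrays, Expokit

# Placeholder problem size; the real matrices are 10,000 x 10,000 or larger.
n = 1000
M = sprandn(n, n, 0.001)            # random sparse matrix as a stand-in
rho_vectorized = randn(n)

# expmv(t, M, v) approximates exp(t*M)*v without forming the dense exponential.
w = expmv(1.0, M, rho_vectorized)
```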

What methods or strategies can I use to speed up the computation of the sparse matrix exponential in Julia? Any advice or tips would be greatly appreciated!

Have you tried using ExponentialUtilities.jl? That’s the standard package for this.
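A minimal sketch of what that could look like (assuming the Krylov-based expv from ExponentialUtilities.jl, which computes exp(t*A)*b; the matrix here is just a placeholder):

```julia
using SparseArrays, ExponentialUtilities

n = 1000
A = sprandn(n, n, 0.001)
b = randn(n)

# expv(t, A, b) approximates exp(t*A)*b via a Krylov subspace method;
# only matrix-vector products with A are needed, so A stays sparse.
w = expv(1.0, A, b)
```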


Thank you for your response. I found a bug in my code that was causing it to run slower than Python. However, when I use ExponentialUtilities.jl, it is very fast, but the result is incorrect. Maybe I am using the wrong function; I will check again. Thanks.
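One way to check whether the call is being used correctly is to compare against the dense matrix exponential on a small test problem (a sketch only; the size and tolerance here are arbitrary):

```julia
using LinearAlgebra, SparseArrays, ExponentialUtilities

n = 50
A = sprandn(n, n, 0.1)
b = randn(n)

w_krylov = expv(1.0, A, b)          # Krylov approximation of exp(A)*b
w_dense  = exp(Matrix(A)) * b       # dense reference, feasible for small n

@assert isapprox(w_krylov, w_dense; rtol=1e-8)
```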