I’ve been learning Julia over the past week and, in the process, have rewritten one of my programs that I previously implemented in MATLAB. The Julia version was noticeably slower, and using the profiler I found that the eigensolver eigen() takes about 2x longer than MATLAB’s eig() solver. My program involves diagonalizing many M×M (Hermitian) matrices, where M ~ 10^2–10^3.
One can compare times for the two eigensolvers to diagonalize random Hermitian matrices in
Julia:
julia> using LinearAlgebra
using MKL
using BenchmarkTools
A = rand(1000, 1000) + im * rand(1000, 1000);
A = A + A';
@btime eigen(A);
487.983 ms (16 allocations: 46.57 MiB)
i.e. 0.487983 seconds. And in MATLAB:
>> A = rand(1000, 1000) + 1i * rand(1000, 1000);
A = A + A';
tic
eig(A);
toc
Elapsed time is 0.199620 seconds.
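One variation worth checking (a sketch, not something I’ve fully explored): wrapping the matrix in `Hermitian` guarantees that `eigen` dispatches to the Hermitian-specific LAPACK path, rather than relying on a runtime symmetry check of the dense matrix:

```julia
using LinearAlgebra
using BenchmarkTools

A = rand(1000, 1000) + im * rand(1000, 1000);
A = A + A';

# Hermitian(A) tells the solver the structure up front, so eigen()
# goes straight to the Hermitian eigensolver routine:
@btime eigen(Hermitian($A));
```

For a matrix that is Hermitian only up to floating-point noise, `Hermitian(A)` also avoids the check failing silently, since it simply uses the upper triangle.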
Both solvers are slower on fully general (i.e. not necessarily Hermitian) matrices, but the discrepancy remains.
Julia:
julia> A = rand(1000, 1000) + im * rand(1000, 1000);
@btime eigen(A);
2.722 s (24 allocations: 36.46 MiB)
MATLAB:
>> A = rand(1000, 1000) + 1i * rand(1000, 1000);
tic
eig(A);
toc
Elapsed time is 1.660253 seconds.
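One possible confounder in the comparison above (this is my assumption about what MATLAB does internally): calling `eig(A)` without requesting eigenvectors computes only the eigenvalues, while Julia’s `eigen` always computes the eigenvectors as well. The closer Julia analogue of a single-output `eig` would be `eigvals`:

```julia
using LinearAlgebra
using BenchmarkTools

A = rand(1000, 1000) + im * rand(1000, 1000);

# eigvals computes only the eigenvalues, like MATLAB's eig(A) with no
# eigenvector output, skipping the eigenvector back-transformation:
@btime eigvals($A);
```

If the timings become comparable with `eigvals`, the gap above may mostly reflect the extra eigenvector work rather than a slower solver.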
Note that I’ve used the MKL.jl package, which I learned about from this thread; it gives a ~2x speedup for the eigen() computation in Julia. Is there a relatively easy way to speed up the computation further in Julia? Could this issue be particular to my machine? I’m happy to provide any further details.
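In case it is machine-specific: one thing I can check on my end (a sketch; thread counts and their effect will vary by machine) is how many threads the BLAS/LAPACK backend is actually using, since a mismatch with MATLAB’s thread count could account for part of the gap:

```julia
using LinearAlgebra

# Report how many threads the loaded BLAS backend (OpenBLAS or MKL) uses:
println(BLAS.get_num_threads())

# Uncomment to pin the backend to all available threads:
# BLAS.set_num_threads(Sys.CPU_THREADS)
```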