Julia is slower than MATLAB at diagonalizing matrices

My guess is that with one or no output arguments, Matlab's eig only computes the eigenvalues and does not compute/store the eigenvectors. In contrast, Julia's eigen computes the full eigendecomposition, that is, both eigenvalues and eigenvectors. If only the eigenvalues are needed, use eigvals instead. The performance of Matlab and Julia is then pretty much identical on my old laptop (running Linux). In fact, Julia is even a bit faster.
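To make the difference concrete, here is a minimal sketch (using a small random Hermitian matrix, just for illustration): eigvals returns only the eigenvalue vector, while eigen returns a factorization object holding both values and vectors.

```julia
using LinearAlgebra

# Wrap a random complex matrix as Hermitian; only one triangle is used.
A = Hermitian(rand(ComplexF64, 4, 4) + I)

λ = eigvals(A)   # just the (real, sorted) eigenvalues
F = eigen(A)     # full decomposition: F.values and F.vectors

@assert λ ≈ F.values
@assert size(F.vectors) == (4, 4)
```

Since the extra work of computing and storing the eigenvectors is skipped, eigvals is the fair comparison against Matlab's one-output eig.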

In Matlab 2022a (and yes, I did run the code a few times):

>> A = rand(1000, 1000) + 1i * rand(1000, 1000); A = A + A';
>> tic, eig(A); toc
Elapsed time is 0.401051 seconds.
>> maxNumCompThreads

ans =

     2

and in Julia 1.7.2:

julia> using LinearAlgebra

julia> using MKL

julia> using BenchmarkTools

julia> A = rand(1000, 1000) + im * rand(1000, 1000);

julia> A = A + A';

julia> A = Hermitian(A);

julia> LinearAlgebra.BLAS.set_num_threads(2)

julia> @btime eigen(A);
  628.196 ms (16 allocations: 46.57 MiB)

julia> @btime eigvals(A);
  297.065 ms (13 allocations: 16.05 MiB)

Indeed, I am afraid that even the title of the post misleads a bit, because in your Matlab code you are really only computing the eigenvalues, which are not sufficient to diagonalize a matrix.
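To spell that last point out with a quick sketch: diagonalizing A means writing A = V Λ V⁻¹, which requires the eigenvectors V, not just the eigenvalues. For a Hermitian matrix V is unitary, so V⁻¹ is simply V'.

```julia
using LinearAlgebra

A = Hermitian(rand(ComplexF64, 100, 100))
F = eigen(A)

# Reconstruct A from its eigendecomposition: this needs F.vectors,
# which eigvals alone does not provide.
@assert A ≈ F.vectors * Diagonal(F.values) * F.vectors'
```

So if the actual goal is diagonalization, eigen (or Matlab's two-output [V, D] = eig(A)) is the right comparison, and the timings above show those are in the same ballpark too.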