Hi everyone!
I’m currently looking into the runtime complexity of some code that uses an eigendecomposition, and I don’t understand what I’m seeing. Reduced to an MWE, I’m doing the following:
- Generate a random n×n Matrix{ComplexF64}
- Measure the runtime of eigen() applied to this matrix (averaged over 10 random matrices per size)
- Plot the average runtime as a function of n
(Code at the end of this post)
From Wikipedia (Eigenvalue algorithm - Wikipedia), I expect this to scale as O(n^3), since I’m computing the eigenvectors as well. However, the data I get seem to follow an O(n^2) scaling instead, i.e. they track the O(n^2) reference line much more closely than the O(n^3) one - why is that?
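(To quantify “seems to follow”: a power law t(n) ≈ c*n^k is a straight line of slope k on the log-log plot, so the empirical exponent can be read off with a least-squares fit. The snippet below is an extra check on top of the MWE, not part of it; it reuses the ns and walltimes variables from the script at the end of this post.)

# Least-squares fit of log(walltimes) ≈ c0 + k*log(ns); k is the empirical scaling exponent.
# Run this after the script below, which defines `ns` and `walltimes`.
logn, logt = log.(ns), log.(walltimes)
c0, k = [ones(length(logn)) logn] \ logt
println("fitted exponent ≈ ", round(k, digits = 2))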
using LinearAlgebra
using Plots

# Matrix sizes to benchmark
ns = 50:50:500

# Warm-up run on a small matrix so the measurements below exclude JIT compilation
A = rand(ComplexF64, 10, 10)
vals, vecs = eigen(A)
println(@elapsed eigen(A))
println(vals, vecs)
# Average the wall time of eigen() over 10 random n×n matrices
function measure(n::Int)
    t = 0.0
    for i in 1:10
        B = rand(ComplexF64, n, n)
        t += @elapsed eigen(B)
    end
    return t / 10
end
walltimes = measure.(ns)

plot(ns, walltimes, xscale=:log10, yscale=:log10, marker=:circle, label="Eigen", legend=:topleft)
# Reference power laws, anchored to the last data point
plot!(ns, walltimes[end] * (ns ./ maximum(ns)) .^ 2, label="O(n^2)")
plot!(ns, walltimes[end] * (ns ./ maximum(ns)) .^ 3, label="O(n^3)")
savefig("eigenscaling.png")
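For completeness, a variant of measure() based on BenchmarkTools.jl (a sketch, assuming that package is installed; measure_bt is just a name I picked): @belapsed reports the minimum elapsed time over many samples, which is usually less noisy than averaging ten @elapsed calls.

using BenchmarkTools

# Same measurement as measure(), but BenchmarkTools chooses the number of samples
# and reports the minimum time; `setup` draws a fresh random matrix for each sample.
function measure_bt(n::Int)
    return @belapsed eigen(B) setup=(B = rand(ComplexF64, $n, $n))
end

# walltimes_bt = measure_bt.(ns)   # drop-in replacement for the measure.(ns) call above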