Can the result of @time measure the spatial complexity of the algorithm?

To demonstrate how the garbage collector’s inconsistent timing throws off any measure of memory usage, here’s a version of the original function that uses Base.gc_live_bytes to measure the change in occupied heap memory across the call:

julia> function myfunc_checkgc(n::Int64)
           start = Base.gc_live_bytes()
           D = 20
           a = rand(D, D)
           for i in 1:n
               a*a
           end
           Base.gc_live_bytes() - start
       end
myfunc_checkgc (generic function with 1 method)

julia> @time myfunc_checkgc(100)
  0.000166 seconds (101 allocations: 328.250 KiB)
336128

julia> @time myfunc_checkgc(100)
  0.000140 seconds (101 allocations: 328.250 KiB)
336128

julia> @time myfunc_checkgc(1000000)
  2.711456 seconds (1.00 M allocations: 3.099 GiB, 33.98% gc time)
-29624217

julia> @time myfunc_checkgc(1000000)
  2.403197 seconds (1.00 M allocations: 3.099 GiB, 31.69% gc time)
24925936

julia> @time myfunc_checkgc(1000000)
  2.410707 seconds (1.00 M allocations: 3.099 GiB, 31.21% gc time)
24845856

When the number of allocations is small enough, the garbage collector often doesn’t run during the call, so the results are mostly consistent. But once the collector triggers partway through the call, you can’t know how much memory was actually occupied at any point; as the negative result above shows, you can even end up with less occupied heap memory at the end than at the start.
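One way to make the live-heap delta more repeatable (a sketch assuming the same 20×20 workload; `myfunc_forcegc` is a name I made up) is to force a full collection before each `Base.gc_live_bytes` reading, so the delta counts only objects that survive the call rather than whatever happened to be uncollected at the moment of measurement:

```julia
# Sketch: stabilize gc_live_bytes readings by forcing full collections.
function myfunc_forcegc(n::Int64)
    GC.gc()                          # full collection, so the baseline is stable
    start = Base.gc_live_bytes()
    D = 20
    a = rand(D, D)
    for i in 1:n
        a * a                        # result is discarded each iteration
    end
    GC.gc()                          # collect again: only surviving objects remain
    Base.gc_live_bytes() - start
end
```

Because the discarded `a * a` results are swept by the final `GC.gc()`, the delta here mostly reflects `a` itself — it tells you what the call *keeps* live, not the peak memory the loop touched.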

The allocation figure @time reports would match the growth in occupied heap memory only if the garbage collector never ran during the call, and that’s just not something you can rely on. At best, you can treat it as a very generous upper bound on the memory a single call keeps live.
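If what you actually want is the cumulative allocation — which is what the `allocations:` figure from @time reports — `Base.@allocated` gives that number directly, and it is unaffected by when (or whether) the collector runs. A sketch; `myfunc_alloc` is a name I made up:

```julia
# Sketch: measure cumulative bytes allocated by the loop, independent
# of garbage-collector timing. Note the first call also counts
# compilation allocations, so warm up before trusting the number.
function myfunc_alloc(n::Int64)
    D = 20
    a = rand(D, D)
    @allocated for i in 1:n
        a * a                # each product allocates a fresh D×D matrix
    end
end
```

The returned byte count grows roughly linearly with `n`, which is exactly why cumulative allocation is a poor proxy for how much heap the algorithm occupies at any one time.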
