I want to measure the total amount of time needed to perform a complete loop. I tried
@time for i in 1:10000
...
...
end
and it returned .988083 seconds (57.58 M allocations: 39.867 GiB, 0.14% gc time), but the code took about 15 minutes to complete the loop, not 0.988083 seconds as indicated. How can I measure the time correctly?
The loop calls a lot of functions I created outside of the loop and uses a dataset I have. Since the code with the functions is large, I thought that posting just the command I'm using to measure time would be enough to tell whether I'm using the right command or placing @time incorrectly.
This is a bit of a tangent, but you shouldn't be doing heavy computations in a globally scoped for loop, as this will be quite bad for performance and memory usage. Consider using a let block or a function (and avoid global variables); see the sketch below.
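A minimal sketch of what that could look like, assuming the per-iteration work is some function of your own (the `work` placeholder here is made up just to keep the example self-contained): put the loop inside a function, pass in everything it needs as arguments, and call @time on that single call.

work(x) = sum(sin, 1:x)          # placeholder for your real computation

function run_loop(n)
    total = 0.0
    for i in 1:n
        total += work(i)         # all variables are local, so the loop avoids global-scope overhead
    end
    return total
end

@time run_loop(10_000)           # reports elapsed time, allocations, and GC time for the whole call

Note that the first call to a function includes compilation time, so running it once before timing (or using BenchmarkTools.@btime) gives a more representative measurement.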
Just out of curiosity, are you using Juno? I have sometimes seen the output of @time get partially "eaten" in the Juno console; I guess it's some weird bug.