# How to improve runtime with Measurements.jl?

Dear all,

I am using the Measurements.jl package to propagate the uncertainty in my dataset. I have an array of voltage values, each with its own uncertainty, but when I compute the mean of the array it takes 27 seconds and allocates 805.25 MiB. The array consists of around 47,000 elements, all of the same type. Does anyone know how to improve the performance of this operation? For an array of the same size containing only Float64 elements, the operation finishes almost instantly. Thanks for helping me out.


Welcome! It's hard to say what's causing allocations or slowing down your code if you don't share it with us - please share an MWE (minimal working example) showing the problem.

Also, have you followed the tips mentioned in the Performance Tips section of the manual?

```julia
using Statistics, Measurements

VoltageArray = rand(50000) # works really fast

VoltageArray = VoltageArray .± 0.1VoltageArray # works really fast

mean(VoltageArray) # takes a long time to run
```

This is an MWE of my problem. I hope I made it clear.


Divisions are expensive:

```julia
julia> @time (1 ± 0.1) / (2 ± 0.2)
  0.000007 seconds (4 allocations: 192 bytes)
0.5 ± 0.071
```

According to this issue on their repo, this is known; you may be looking for `weightedmean` instead:

```julia
julia> v = rand(50_000);

julia> v = v .± 0.1 .* v;

julia> @time weightedmean(v)
  0.001619 seconds (6 allocations: 781.438 KiB)
0.0002565 ± 2.2e-6
```

Though you should note:

> NOTA BENE: correlation is not taken into account.

Accumulation of error bars is non-trivial; otherwise this would presumably already be much faster.


Sure, divisions, but look at this:

```julia
jl> v = rand(100);

jl> vm = v .± 0.1 .* v;

jl> @btime sum($v)
  9.409 ns (0 allocations: 0 bytes)
54.45849425120919

jl> @btime sum($vm)
  14.700 μs (102 allocations: 6.44 KiB)
54.46 ± 0.62
```

That's a 1500× slowdown(!). Is that really expected?


Not my words

https://github.com/JuliaPhysics/Measurements.jl/issues/25

Maybe something has changed since then (perhaps @giordano can chime in), but I doubt anything grand has changed here. Keeping track of correlations is hard.


Yes, unfortunately that's expected; I made the example of the `mean` in the issue linked above. As @Sukera pointed out, tracking correlation is hard. It's pretty easy to write a package that propagates uncertainties super quickly by ignoring correlations, which is what `Measurements.jl` did until v0.02, but that's also incredibly dumb and useless: almost no identities would hold; for example, `x + x` and `2 * x` would give you different results.
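To see why ignoring correlations breaks that identity, here is a minimal sketch in plain Julia of what a naive, correlation-free propagator would do (the helper names `naive_sum_sigma` and `scale_sigma` are hypothetical, not from `Measurements.jl`): adding uncertainties in quadrature implicitly assumes the two operands are independent, which the two occurrences of `x` in `x + x` are not.

```julia
# Naive, correlation-free propagation (a sketch, not Measurements.jl's code).
naive_sum_sigma(σa, σb) = sqrt(σa^2 + σb^2)  # σ for a + b, assuming a, b independent
scale_sigma(k, σ) = abs(k) * σ               # σ for k * x, exact

σx = 0.1
σ_xplusx = naive_sum_sigma(σx, σx)  # treats the two x's as independent: ≈ 0.141
σ_2x     = scale_sigma(2, σx)       # correct answer: 0.2
```

Tracking correlations makes both expressions agree, which is exactly what costs `Measurements.jl` its speed.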

Regarding the mean in particular, note that most of the time users actually want to compute the weighted mean, for which `Measurements.jl` provides a specific function, `weightedmean`, which should have much more reasonable performance. Note that it ignores correlations, as warned in its docstring, because you'd usually apply it to a sample of independent measurements anyway.

To be clear, `Measurements.jl` is slow because of an algorithmic limitation: it uses an O(n^2) algorithm to propagate uncertainties, so you can see why `mean`/`sum` are particularly bad and get worse and worse as the size of the vector increases. There may be clever ways to reduce the complexity of the algorithm, but I never had the time to look into it. I believe the Python package `uncertainties` now has an algorithm that is O(n), or at any rate better than O(n^2), but until a few years ago it used basically the same algorithm as `Measurements.jl` (well, historically it's the other way around). If anyone is willing to help, I'd be glad to hear from them.
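As a practical workaround when the inputs really are independent (as in the `rand(50000) .± 0.1 .* rand(50000)`-style example above), the mean's uncertainty can be propagated by hand in O(n) using plain `Float64` arrays. This is a hypothetical helper, not part of `Measurements.jl`'s API, and it only matches the correlation-tracking result when the measurements are truly uncorrelated:

```julia
# O(n) mean for *independent* measurements, ignoring correlations.
# For y = (x₁ + … + xₙ)/n with independent xᵢ: σ_y = sqrt(Σᵢ σᵢ²) / n.
function uncorrelated_mean(values::AbstractVector{<:Real},
                           sigmas::AbstractVector{<:Real})
    n = length(values)
    m = sum(values) / n
    σ = sqrt(sum(abs2, sigmas)) / n  # quadrature sum, scaled by 1/n
    return m, σ
end
```

Calling `uncorrelated_mean(v, 0.1 .* v)` on 50,000 elements finishes in microseconds rather than seconds, at the cost of silently assuming independence.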


Okay, I had no idea, and thought it was the same as interval arithmetic.

No, I believe interval arithmetic doesn't track correlations at all, which is probably fine for what it's applied to.
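That lack of correlation tracking is known as the dependency problem, and it is easy to illustrate with a toy interval subtraction in plain Julia (a sketch with a hypothetical `intsub` helper; real packages such as IntervalArithmetic.jl also handle directed rounding and more):

```julia
# Interval subtraction treats each operand as an independent range,
# so x - x is not the point interval [0, 0].
intsub(a::Tuple{Float64,Float64}, b::Tuple{Float64,Float64}) =
    (a[1] - b[2], a[2] - b[1])

x = (1.0, 2.0)
intsub(x, x)  # (-1.0, 1.0): the correlation between the two x's is lost
```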


Bummer, that project has a weird custom license. It may be a BSD license according to the git history, but I'm not a copyright lawyer…

I think it is a 3-clause BSD license (The 3-Clause BSD License | Open Source Initiative). What that means for a port I don't know, but it's that one.

Then we may be in luck - that license only places restrictions on redistributions in binary or source form (a port is, as far as I know, neither), as well as on endorsing our port with their name (i.e. we should be fine if we write something like "port inspired by"…).