using MonteCarloMeasurements
using Unitful
function f1(Vi)
    if Vi ≤ 0.0
        return 0.0
    elseif Vi ≥ 1.0
        return 1.0
    else
        return Vi
    end
end
function f2(Vi)
    if Vi ≤ 0.0u"V"
        return 0.0u"V"
    elseif Vi ≥ 1.0u"V"
        return 1.0u"V"
    else
        return Vi
    end
end
With these definitions, f1(0.0..1.0) succeeds, but f1(-0.5..1.5) fails. This is understandable, as the second distribution spans the discontinuities. If I then call register_primitive(f1) and subsequently execute f1(-0.5..1.5), it succeeds, and inspection of the result shows it is as I expect.
However, if I try the same thing with f2 (which I intend to be exactly the same function, except that now I’m using Unitful and everything should be in units compatible with u"V"), f2 fails, regardless of register_primitive(f2).
EDIT: Adding calls to f2: f2((0.0..1.0)u"V") works, f2((-0.5..1.5)u"V") always fails.
The problem here is that (-0.5..1.5)u"V" creates an object of type Quantity{Particles}, whereas register_primitive only overloads f2 for Particles, so that second method never gets called. I don’t immediately see a very clean workaround for this on the MonteCarloMeasurements side, but it would be worth opening an issue there; maybe other people have better ideas. As a quick solution for your case, you could overload f2 for Quantity{Particles} with the following definition:
function f2(x::Quantity{Particles{T,N}}) where {T,N}
    u = unit(x)
    x = ustrip(u, x)
    v = map(x -> f2(x * u), Vector(x))
    return Particles{eltype(v),N}(v)
end
But I agree that’s definitely cumbersome.
Edit:
I got the definition wrong at first; now it works, but the result throws an error when printed.
The code Simeon shared is close to a register_primitive for the special type Quantity{Particles}. Hopefully we should be able to do this a bit more automatically, possibly while working with the reverse representation, i.e., Particles{Quantity}.
Here’s my workaround for simple Monte Carlo with units:
using Unitful
using Plots
using Distributions, Random
function systematic_sample(d, N = 10000)
    e = 0.5 / N
    y = e:1/N:1
    o = map(y) do y
        quantile(d, y)
    end
    permute!(o, randperm(N))
end
U(min, max) = systematic_sample(Uniform(min, max))
hist(X) = histogram([x.val for x in X])
function f2(Vi)
    if Vi ≤ 0.0u"V"
        return 0.0u"V"
    elseif Vi ≥ 1.0u"V"
        return 1.0u"V"
    else
        return Vi
    end
end
p = U(-0.5,1.5)u"V"
p2 = f2.(p)
extrema(p2)
hist(p2)
I’ve stolen your systematic-sample function (simplified a bit) and just switched to ordinary broadcasting. This lets me hang on to Unitful, and I have a syntax that is simple and readable. Because I have some piecewise-linear things (which is really how I got to the above MWE), I’d already lost some of the speed advantage of MonteCarloMeasurements over the brute-force method.
Another workaround would be to run your computations with scalar units first; then, once everything is verified to be correct, you can switch to uncertain values for the uncertainty propagation.
Let me describe my present application. I’m using Julia in place of a spreadsheet for electronic design calculations. To that end, I want concise, readable representations of the relationships that include units (no Mars Orbiter mistakes here, please) and tolerances (which I want to be aware of in the first pass, not as an afterthought).
After looking at Measurements.jl, MonteCarloMeasurements.jl, IntervalArithmetic.jl, and probably some things I can’t remember, I think I want the following for this work:
Independent distributions that don’t forget who they are. In other words, if a and b are distinct distributions with the same properties, then I expect a/a to be exactly 1 with zero deviation. Interval arithmetic doesn’t satisfy this.
I want to be able to choose distributions. I generally don’t have the input distributions, only the specification limits, so I may choose to be somewhat conservative and use a uniform distribution as in my MWE, or maybe a normal truncated at ±3σ. Measurements.jl doesn’t allow that choice.
I have functions that can take a variety of forms, from simple linear equations to piecewise-linear approximations to nonlinear things, including saturation effects.
I want to have physical units attached. MonteCarloMeasurements.jl does this if the functions are simple.
Speed is not a big deal (right now). I am doing related simulations, some with Julia-based tools, some with other circuit-specific tools, but this work is more likely to inform what corners I simulate, at least for now.
One additional thing I would like is to get both statistical values and worst-case values in one pass, where by worst case I mean the calculation evaluated over the product of all the min/max input values (although the true extremes don’t always occur there). Taking extrema of results based on uniform-distribution inputs gets close.
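The independence point above can be seen with plain sample vectors, no packages needed (the names here are just illustrative). Reusing the same sample vector for a/a collapses the spread to exactly zero, while two independent vectors with identical distributions keep the full interval-arithmetic-style spread:

```julia
using Random  # stdlib, only used for seeding

Random.seed!(1)
a = rand(10_000) .+ 1.0   # 10k samples of one quantity, uniform on [1, 2]
b = rand(10_000) .+ 1.0   # an independent quantity with the same distribution

ratio_self  = a ./ a      # each sample divided by itself: exactly 1.0
ratio_other = a ./ b      # independent samples: the spread survives

extrema(ratio_self)   # (1.0, 1.0), i.e. zero deviation
extrema(ratio_other)  # close to (0.5, 2.0), the naive interval bounds
```

This is the behavior MonteCarloMeasurements gives (and interval arithmetic doesn’t): identity of the distribution, not just its range, flows through the computation.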
I’m noodling on the idea of a struct that carries the upper and lower limits plus the cloud of distribution points, and an application function (tmap, for "tolerancing map"?) that does a worst-case calculation based on a Base.Iterators.product of the input spec limits, also does the Monte Carlo calculation, and then produces a similar struct for the computed output. I think I’ll let that idea simmer while I go back to work!
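A minimal sketch of that idea in plain Julia, under the stated assumptions (the names Tol and tmap are hypothetical, and it takes the corners of the spec limits as the worst case, which as noted above isn’t always where the true extremes lie):

```julia
struct Tol
    lo::Float64             # lower spec limit
    hi::Float64             # upper spec limit
    cloud::Vector{Float64}  # Monte Carlo sample cloud
end

# Uniform sampling between the spec limits (the conservative choice, as in the MWE)
Tol(lo, hi; N = 10_000) = Tol(lo, hi, lo .+ (hi - lo) .* rand(N))

function tmap(f, xs::Tol...)
    # Worst case: evaluate f at every corner of the input spec limits
    corners = (f(c...) for c in Iterators.product(((x.lo, x.hi) for x in xs)...))
    lo, hi = extrema(corners)
    # Monte Carlo: apply f samplewise across the input clouds
    cloud = map(f, (x.cloud for x in xs)...)
    return Tol(lo, hi, cloud)
end

R1 = Tol(0.9, 1.1)   # e.g. a 1 Ω resistor with ±10 % tolerance
R2 = Tol(1.8, 2.2)   # e.g. a 2 Ω resistor with ±10 % tolerance
Rpar = tmap((a, b) -> a * b / (a + b), R1, R2)   # parallel combination
(Rpar.lo, Rpar.hi)   # ≈ (0.6, 0.733)
```

For monotone functions like the parallel-resistor combination the corner evaluation does give the true extremes; for non-monotone f the corners can miss interior extremes, which is exactly the caveat above.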
Thank you for your work on MonteCarloMeasurements, it’s a cool package!
I’ve added some support for Unitful in this PR
I’m not a user of Unitful myself, so I’m not sure what to test, but anyone interested is welcome to try it out and provide some test cases if it fails.