Define a distribution from a given distribution

I would like to define two distributions:

p = Uniform(0,1)
p2 = 1-p

gives me a MethodError:

ERROR: MethodError: no method matching -(::Int64, ::Uniform{Float64})

I also cannot sum two distributions:

c = Normal(20,5) + Normal(10, 6)

I found this Issue:
Is it possible to build the new distribution in different ways?
Does Turing have better support for operations on distributions?


AFAIK these kinds of operations are not supported by Distributions.jl at the moment.

What do you want to do with the resulting new distribution? Eg generate random draws, calculate pdf/cdf/quantiles, etc?


I need them for use in Turing models: sample generation and posterior inference.

The problem here is distinguishing between a distribution and a sample from a distribution. Even if you could define something like

p2 = (1-x for x in p)

then there would still be no way to distinguish p from p2.

There are two ways out:

  1. Define a (degenerate) distribution over pairs. I don’t think Distributions will help you here, but you’ll be able to do this in MeasureTheory.jl.
  2. Work in terms of samples

For the latter, it’s easy in MonteCarloMeasurements:

julia> using MonteCarloMeasurements

julia> p = Particles(Uniform())
 0.5 ± 0.289

julia> p2 = 1-p
 0.5 ± 0.289

julia> p+p2

Chances are that your best bet is to circumvent the problem. What are you trying to do, as Tamas asks?


For Turing, you should be using the tilde operator (~) rather than assignment (=).

Your first example would be

p ~ Uniform(0,1)
p2 = 1-p

which will work in a Turing @model block. For your second example, you would have to draw the two variables separately and assign them before you add them:

a1 ~ Normal(20,5)
a2 ~ Normal(10, 6)
c = a1 + a2

Check out the documentation for more.
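Putting the two snippets above together, a complete model might look like the following sketch (`sum_of_normals` is a hypothetical name, and the exact sampler call may vary between Turing versions):

```julia
using Turing

# Sketch: combine the two examples above into one @model block.
# Random variables are drawn with ~; deterministic transformations use =.
@model function sum_of_normals()
    a1 ~ Normal(20, 5)
    a2 ~ Normal(10, 6)
    c = a1 + a2          # deterministic function of the sampled values
    return c
end

# Draw prior samples; each generated value of c is a draw from the sum.
chain = sample(sum_of_normals(), Prior(), 1_000)
```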

  1. Have you looked at Expectations.jl?
    It allows you to compute expectations of transformations of random variables.
dist = Normal()
E = expectation(dist)
E(x -> x)
expectation(x->x^2, dist)
d = MixtureModel([Uniform(), Normal(), Gamma()]);
E = expectation(d);
E(x -> x) # 0.5000000000000016
  2. I think it would be amazing if Distributions.jl supported transformations of random variables.
    Mathematica is the only software I know that does this well currently.

    I really hope this becomes available in Distributions.jl.

Those are two distinct problems. Sample generation is super easy (you just define a rand), but inference will be trickier: for monotone transformations you just transform the likelihood and correct with the Jacobian, while for sums that don’t result in a “known” distribution you have to do something else, eg latent variables.
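On the sample-generation side, the sum of two independent normals is one of the cases that does have a closed form, Normal(30, sqrt(5² + 6²)), and this is easy to check by sampling with nothing but Base Julia (a minimal sketch using only the standard library):

```julia
using Random, Statistics

# Draw from Normal(20, 5) + Normal(10, 6) by sampling each term and adding.
rng = MersenneTwister(1)
n = 1_000_000
c = (20 .+ 5 .* randn(rng, n)) .+ (10 .+ 6 .* randn(rng, n))

# The sum of independent normals is Normal(20 + 10, sqrt(5^2 + 6^2)).
mean(c)   # ≈ 30
std(c)    # ≈ sqrt(61) ≈ 7.81
```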

This is one of those problems that are easy to state but require a moderate amount of work to solve, specific to the problem. You should focus on the actual question you are trying to solve and describe that here, so that we can give more specific help.
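For the monotone case, the change-of-variables recipe (transform the likelihood and correct with the Jacobian) can be sketched in a few lines of Base Julia; the helper names here are hypothetical, and the example is the original y = 1 - x applied to Uniform(0, 1):

```julia
# Change of variables for a monotone transformation y = g(x):
#   pdf_Y(y) = pdf_X(g⁻¹(y)) * |d g⁻¹ / dy|

unif_pdf(x) = (0 <= x <= 1) ? 1.0 : 0.0   # pdf of Uniform(0, 1)

g(x)    = 1 - x    # monotone transformation
ginv(y) = 1 - y    # its inverse; |d ginv / dy| = 1 here

transformed_pdf(y) = unif_pdf(ginv(y)) * 1.0   # Jacobian correction

transformed_pdf(0.3)   # 1.0 — so 1 - Uniform(0,1) is again Uniform(0,1)
```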


Hi Tamas

that may be the kind of problem that is intriguing enough to try to solve…

Just to have some guidance here, is Distributions.jl the place to start looking? And should I look at the semantics provided by Mathematica, as @Albert_Zevelev suggests?

All the best



@Marco_Antoniotti I have to make an important clarification.

Mathematica can handle transformations of a distribution when it can solve the problem symbolically.
In Julia things are usually done numerically, not symbolically.
Eg: Expectations.jl computes expectations numerically using quadrature.

For example consider a transformed Beta[4,3]. Mathematica can do all the following except it has no closed form for the entropy.

Now consider a transformed Beta[\alpha, \beta]:

I wrote a cheat-sheet which shows that Julia handles random variables much better than Matlab/R/Stata.
If we want Julia to deal with general transformations of random variables, we are probably going to have to do it numerically (but this means we can handle more transformations than Mathematica).


Very nice summary @Albert_Zevelev . Thanks.
Yes, I agree that things should be done numerically in Julia (unless there are
any heavy C.A. lifters around; don’t look at me 🙂).

All the best



Generally the sum of two independent random variables only has a closed form in some special cases, so I am not sure what you are looking for here.


You should look there! In some sense that is my research area. Assume you need to work with the conditional density (or posterior) p of X given X + Y, which is not known in closed form. If you can find approximations X̃, Ỹ for which you can compute the closed form of the density of X̃ given X̃ + Ỹ, then you can use Monte Carlo methods to change samples from the conditional distribution of X̃ into samples from the conditional distribution of X that you are actually interested in. This is used in almost all of my papers on arXiv: Moritz Schauer's articles on arXiv