Implementing a model where data is a function of random variables

Hi,

I’ve got a question regarding the implementation of a model where an observed variable can be described as a function of other random variables. My issues here are perhaps just due to my inexperience with probabilistic programming rather than a problem specific to Turing, so any help is appreciated. In particular, I have a problem where an observed quantity I is given by a function of the form:

I=A\cos(\phi)+B

where \phi\sim \mathcal{N}(\mu_1,\sigma_1) and \phi\sim \mathcal{N}(\mu_2,\sigma_2). Model parameters are then A,\mu_1,\sigma_1,\mu_2,\sigma_2. I could in principle derive the distribution for I, I\sim\mathcal{F}(A,\mu_1,\sigma_1,\mu_2,\sigma_2), and construct a Turing model like

using Turing

@model function offset_model(y)
    A ~ Uniform(0, 100)
    mu1 ~ Uniform(0, 100)
    sigma1 ~ Uniform(0, 100)
    mu2 ~ Uniform(0, 100)
    sigma2 ~ Uniform(0, 100)
    # The number of observations.
    N = length(y)
    for n in 1:N
        # F stands for the (to-be-derived) distribution of I.
        y[n] ~ F(A, mu1, sigma1, mu2, sigma2)
    end
end;

Using this, given a set of observations of I, I could then draw the posteriors for my model parameters.
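For completeness, the kind of call I have in mind would be something like this (just a sketch; F is the hypothetical derived distribution, and the sampler choice and iteration count are arbitrary):

# `observations` is a placeholder for a Vector of measured I values.
chain = sample(offset_model(observations), NUTS(), 1_000)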

But I’m wondering if there is some way to do this automatically in Julia + Turing, without having to derive what would be a pretty complex distribution function for I. In particular, a flexible way to specify things would be useful, since \phi or B could potentially follow different distributions, and deriving the corresponding distribution for I each time would be very cumbersome.

As said before, any help is greatly appreciated!

You could possibly use some functionality from MixedModels.jl (see the MixedModels documentation).

Is there a notational error? Or, if not, what does it mean that \phi is drawn from two different distributions here?

If you allow a small error term, then you can try to do something like this:

y[i] ~ Normal(A * cos(phi) + B, small_error)

I suppose I is typically O(1) or some such thing, so let small_error be something like 0.001 to get started and see what happens.
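To make that concrete, here’s a minimal sketch of how it could look as a Turing model, treating each \phi as a latent variable inside the model. Everything here (the priors, the default small_error, the model name) is just a placeholder to illustrate the idea, not something from your post:

using Turing

@model function noisy_offset_model(y, small_error = 0.001)
    A ~ Uniform(0, 100)
    B ~ Uniform(0, 100)
    mu1 ~ Uniform(0, 100)
    sigma1 ~ Uniform(0, 100)
    N = length(y)
    # One latent phase per observation, all drawn from Normal(mu1, sigma1).
    phi ~ filldist(Normal(mu1, sigma1), N)
    for n in 1:N
        # Observe y through a narrow Normal centred on the deterministic value.
        y[n] ~ Normal(A * cos(phi[n]) + B, small_error)
    end
end;

The narrow observation noise is a stand-in for the exact relation I = A\cos(\phi) + B; if small_error is made too small the posterior can become hard for the sampler to explore, so it may need some tuning.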


Oh, yes, that was dumb of me. I have that \phi\sim \mathcal{N}(\mu_1,\sigma_1) and B\sim \mathcal{N}(\mu_2,\sigma_2) (so the second normal is for B, not \phi).

I did not know about MixedModels, thanks for pointing it out! This actually helped me realize that what I’m trying to do is construct a non-linear mixed-effects model, which on closer inspection turns out to be a much more complex problem than I thought at first.
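For my own reference, the corrected version of the sketch above, with B also treated as a per-observation latent variable (the priors are still just placeholders), would then look something like:

using Turing

@model function corrected_offset_model(y, small_error = 0.001)
    A ~ Uniform(0, 100)
    mu1 ~ Uniform(0, 100)
    sigma1 ~ Uniform(0, 100)
    mu2 ~ Uniform(0, 100)
    sigma2 ~ Uniform(0, 100)
    N = length(y)
    # Latent phase and offset for each observation.
    phi ~ filldist(Normal(mu1, sigma1), N)
    B ~ filldist(Normal(mu2, sigma2), N)
    for n in 1:N
        y[n] ~ Normal(A * cos(phi[n]) + B[n], small_error)
    end
end;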