Initializing every element of an array with a different distribution

Hello, I have a Turing model that takes an array of unknown dimensions, and I want every element of that array to take on a different distribution depending on its index. However, I'm having trouble making this work. Here's a minimal example that fails:

```julia
@model tr(x) = begin
    x = Array{Any,1}(missing, 2)
    x[1] ~ Normal()
    x[2] ~ Normal()
end

model = tr(missing)
model()
```

I would expect the output to be the x vector, but instead of a pair of numbers I'm getting only a single number, like so:

```julia
model()
-1.8639815507482753
```

What am I doing wrong, and what's the right way to do this? Note that I don't want to declare

```julia
x1 ~ Normal()
x2 ~ Normal()
x = [x1, x2]
```

That would serve the purpose for this toy example, but it doesn't work when the dimensions of the array are unknown and the distribution of the i-th element depends on i.

OK, I managed to figure this out myself: I just drop the x argument from tr(x) and return x explicitly, so that the model's return value is the vector rather than the value of the last statement.

```julia
@model tr() = begin
    x = Array{Any,1}(missing, 2)
    x[1] ~ Normal()
    x[2] ~ Normal()
    return x
end
```

That seems to work.
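The same pattern should also generalize to the case described above, where the length is not fixed in advance and the distribution of each element depends on its index. A hedged sketch, assuming the length n is known at model-construction time (the name `tr_n` and the choice of `Normal(i, 1.0)` are illustrative assumptions, not from the original):

```julia
using Turing

# Sketch: n is passed in when the model is constructed, and the
# distribution of x[i] depends on the index i (here Normal(i, 1.0),
# purely as an illustration of an index-dependent distribution).
@model tr_n(n) = begin
    x = Vector{Any}(missing, n)
    for i in 1:n
        x[i] ~ Normal(i, 1.0)
    end
    return x
end

model = tr_n(3)
model()  # should return a 3-element vector, one draw per index-dependent Normal
```

Each `x[i] ~ ...` inside the loop registers a separate random variable with Turing, so the vector entries really are sampled from different distributions rather than being copies of one draw.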