Puzzling on a product likelihood using Distributions and Turing

I am currently working on a simple Bayesian problem in which the likelihood is a product of N independent Poisson distributions, i.e. likelihood(x) ~ product_i(Poisson(mean_i)), where x is a vector of observed variables and the product runs over its dimensions (i = 1 to N). mean_i is the mean associated with the variable x_i in the vector x.

As far as I understand Turing and Distributions, I think the following is the ‘correct’ code for the likelihood:

vector_x = [x for x in variables]
vector_x ~ Product([Poisson(mean_i) for mean_i in means])

where means is a vector containing the mean of each Poisson process. Is this correct? :smiley:
I am asking because I am cross-checking an analysis done by other people, and the code above produces a result that does not agree with theirs.

Welcome, @riegel_gestr!

Not sure if I understood your problem correctly, but if you just want to model your x’s as iid Poisson draws, then a Turing model could look like this:

using Turing

# simulate fake data
lambda = 3
n = 100
x = rand(Poisson(lambda), n)

# define Turing model
@model function mymodel(x)
    # prior
    lambda ~ LogNormal(1.5)
    # likelihood
    for i in eachindex(x)
        x[i] ~ Poisson(lambda)
    end
end

# sample from posterior
post = sample(mymodel(x), NUTS(), 2000)

You could of course also specify lambda as some function of additional predictor variables, in which case you get a Poisson regression. Turing internally accumulates the log likelihood so you don’t have to specify the product over the pointwise likelihoods (or rather the sum over the pointwise log likelihoods).
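Since your original question had a different mean for each observation rather than a single shared lambda, here is a hedged sketch of that variant. It uses `product_distribution` from Distributions.jl (which Turing re-exports) to treat the whole vector as one multivariate observation; the `base_rates` vector and the common scale parameter `c` are made-up stand-ins for whatever structure your real means have:

```julia
using Turing

# Hypothetical per-observation base rates and simulated counts (stand-ins)
base_rates = [2.0, 3.5, 1.2, 4.0]
x = rand.(Poisson.(3 .* base_rates))  # one draw per rate

@model function mymodel_vec(x, base_rates)
    # prior on a common scale factor
    c ~ LogNormal(1.0)
    # one multivariate observation: independent Poissons with different means
    x ~ product_distribution(Poisson.(c .* base_rates))
end

post = sample(mymodel_vec(x, base_rates), NUTS(), 1000)
```

This is equivalent to looping `x[i] ~ Poisson(c * base_rates[i])` over `eachindex(x)`; the vectorized form just states the product likelihood in one line.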

I feel bad that I am answering almost a month later. I tried your solution, and now I get the same trend as the data I was cross-checking, but the values are still different. The origin of this difference is probably elsewhere in the code.
Thank you very much for your help (and sorry again for the late reply)!