Efficient vectorization in Turing.jl

I have a fairly simple hierarchical model with two population-level parameters, a and b. b also acts as a hyperparameter for the individual-level parameters: each b_i is drawn from a distribution parametrized by b. Together, a and b_i govern the shape of an individual-level function f(t; a, b_i), and the individual-level data are sampled from a Bernoulli with success probability f(t; a, b_i) at each timepoint t.

What’s the best way to vectorize this? Since the b_i are i.i.d. given b, filldist should do the job for them. However, the observations are not identically distributed across individuals (since f(t; a, b_i) differs for each individual), so it seems like I need arraydist there.
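
To be explicit about the distinction I mean, here is a minimal sketch; the Normal and the placeholder probabilities are just for illustration:

```julia
using Turing, Distributions  # filldist and arraydist are re-exported by Turing

# i.i.d. case: one distribution replicated num_individuals times
b_individual_dist = filldist(Normal(0.0, 1.0), 10)

# non-identical case: a different Bernoulli per element, bundled into one distribution
p = rand(10)                       # placeholder success probabilities
responses_dist = arraydist(Bernoulli.(p))
```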

So the model boils down to:

a_population ~ P(.)
b_population ~ Q(.)
b_individual ~ filldist(Q(.; b_population), num_individuals)
individual_responses(a, b_i) = arraydist(map(t -> Bernoulli(f(t; a, b_i)), 1:num_timepoints))
population_responses ~ arraydist(map(i -> individual_responses(a_population, b_individual[i]), 1:num_individuals))

where I’m using P(.) and Q(.) to denote arbitrary continuous-valued distributions.
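
For reference, here is a runnable sketch of the above in actual Turing syntax. The Normal priors (standing in for P(.) and Q(.)), the logistic form of f, and the assumption that Y is a num_timepoints × num_individuals matrix of 0/1 responses are all placeholders on my part:

```julia
using Turing, Distributions

# Placeholder curve; the real f(t; a, b_i) will be different.
f(t, a, b_i) = 1 / (1 + exp(-(a * t + b_i)))

# One multivariate Bernoulli over all timepoints for a single individual
individual_dist(a, b_i, timepoints) = arraydist(map(t -> Bernoulli(f(t, a, b_i)), timepoints))

@model function hierarchical_bernoulli(Y, timepoints)
    num_individuals = size(Y, 2)

    # Population-level parameters (Normal priors stand in for P(.) and Q(.))
    a_population ~ Normal(0, 1)
    b_population ~ Normal(0, 1)

    # Individual-level parameters: i.i.d. given b_population, so filldist applies
    b_individual ~ filldist(Normal(b_population, 1.0), num_individuals)

    # Column i of Y holds individual i's responses across timepoints
    Y ~ arraydist(map(i -> individual_dist(a_population, b_individual[i], timepoints),
                      1:num_individuals))
end
```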
Is using map here efficient? I’m assuming it’s better than an array comprehension, but I’m not sure how well Turing handles it. All the parameters are continuous, so I’m hoping to use NUTS with reverse-mode automatic differentiation. The full model will have more parameters, but this is a minimal example.
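
The sampling call I have in mind looks roughly like this, assuming a recent Turing version where the AD backend is selected via the `adtype` keyword, and using the hypothetical model name from the sketch above with simulated filler data:

```julia
using Turing, ReverseDiff

timepoints = 1:20
Y = rand(0:1, length(timepoints), 50)   # placeholder 0/1 responses for 50 individuals
model = hierarchical_bernoulli(Y, timepoints)

# NUTS with reverse-mode AD; compile=true caches the ReverseDiff tape, which should
# be safe here since the model has no parameter-dependent control flow.
chain = sample(model, NUTS(; adtype = AutoReverseDiff(compile = true)), 1_000)
```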