Is there any way to print likelihood values while sampling with NUTS() in Turing.jl? That way we wouldn't sit blind during a run and would have some information about progress.

Do something like the following:

```
using Distributions, Turing, StatsPlots

function main(n)
    # sample is from an exponential, prior is lognormal
    y = rand(Exponential(3.0), n)
    # the model: prior and likelihood
    @model function ExpModel(y)
        θ ~ LogNormal(1.0, 1.0)    # the prior
        y .~ Exponential(θ)        # the likelihood (y is a vector, so broadcast)
        @show loglikelihood(Exponential(θ), y)
    end
    # get the chain
    chain = sample(ExpModel(y), NUTS(200, 0.65), 1000)
end

## run it
n = 300
chain = main(n)

## examine and plot the last part of the chain
display(chain)
plot(chain[end-500:end])
```

I just explained this example in class earlier today, so I had it in mind! The line with @show is what displays the log-likelihood. Doing that might slow things down; I haven't checked.

Thank you for your reply and example.

It shows the log-likelihood, but that's more like printing everything. I was looking for something that updates the log-likelihood only every x = 100 iterations, say. So for your example it should only print 3 times for n = 300.
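One lightweight workaround (a sketch, not from the thread): keep a counter outside the model and only print on every 100th model evaluation. The `neval` counter is my own addition. Note that NUTS evaluates the model (and its gradient) many times per iteration, so this counts evaluations, not iterations:

```julia
using Distributions, Turing

# Hypothetical counter, held in a Ref so the model body can mutate it
const neval = Ref(0)

@model function ExpModel(y)
    θ ~ LogNormal(1.0, 1.0)    # the prior
    y .~ Exponential(θ)        # the likelihood (y is a vector)
    neval[] += 1
    # only print on every 100th model evaluation
    if neval[] % 100 == 0
        @show neval[] loglikelihood(Exponential(θ), y)
    end
end
```

This keeps all the printing logic inside the model, so no changes to `sample` are needed, at the cost of the counter being a global.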

I suppose that would require modifying `sample` from Turing. It might be a nice issue to raise; I can see it being useful for long-running samples that might be straying into problems.
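For what it's worth, `sample` (via AbstractMCMC) also accepts a `callback` keyword argument that is invoked once per iteration, which may be closer to what was asked for than modifying `sample` itself. A sketch, assuming the transition object stores the log density in a field named `lp` (field names can differ across Turing versions, so check yours if this errors):

```julia
using Distributions, Turing

y = rand(Exponential(3.0), 300)

@model function ExpModel(y)
    θ ~ LogNormal(1.0, 1.0)    # the prior
    y .~ Exponential(θ)        # the likelihood (y is a vector)
end

# Called by AbstractMCMC once per iteration; print every 100th iteration.
# `transition.lp` is an assumption about the transition's field layout.
function report(rng, model, sampler, transition, state, iteration; kwargs...)
    iteration % 100 == 0 && println("iteration $iteration: lp = $(transition.lp)")
end

chain = sample(ExpModel(y), NUTS(200, 0.65), 1000; callback = report)
```

Unlike a counter inside the model, this fires once per saved iteration rather than once per model evaluation, so it prints exactly every 100 iterations.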
