Hi everybody!

I have been using the AdvancedHMC NUTS implementation and I am quite satisfied with the results I am obtaining.

The model I am trying to study is quite simple:

```
logℓ(θ) = logpdf(MvNormal(data, mycov), theory(θ))
```

where `theory(θ)` is a function that computes the theoretical prediction of my model given the value of the parameters `θ`.

Now I’d like to put priors on some parameters. For some of them I am using a `Normal`, but for some of them I’d like to use a `Uniform` distribution.

As far as I understand, samplers such as NUTS require the target density to have support on all of ℝ (or ℝⁿ). This issue can be solved using Bijectors.jl.

For instance, let’s say I want to use the following prior:

```
Uniform(-2,2)
```

Then, using `Bijectors`, I can define

```
b = bijector(Uniform(-2,2))
b⁻¹ = Bijectors.Inverse(b)
```
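As a quick sanity check, here is a small sketch using the same calls as above (the exact transform Bijectors picks for a bounded interval is an implementation detail): `b` maps the interval (-2, 2) to ℝ, and `b⁻¹` undoes it.

```
using Bijectors, Distributions

b = bijector(Uniform(-2, 2))   # constrained (-2, 2) → unconstrained ℝ
b⁻¹ = Bijectors.Inverse(b)     # unconstrained ℝ → constrained (-2, 2)

u = b(0.5)    # some finite real number
θ = b⁻¹(u)    # round-trips back to 0.5
```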

Let’s assume for simplicity that I have a single parameter in my model. Then I can write

```
logℓ(θ) = logpdf(MvNormal(data, mycov), theory(b⁻¹(θ))) + logpdf(Uniform(-2,2), b⁻¹(θ))
```

Is this right or am I doing something wrong?

Thank you,

Marco

You might consider just shelling out for a PPL, which for the model you describe above is super simple; in pseudo-code it could be something like:

```
using Turing

@model function mymodel()
    θ ~ Uniform(-2, 2)
    d ~ MvNormal(theory(θ), mycov)
end

problem = condition(mymodel(), (d = mydata,))
samples = sample(problem, HMC())
```

and Turing (or whatever library you choose, like Soss, Tilde…) will handle the bijections for you (including jacobian factors which I’m not sure you had) and you can play with the prior without much thinking.


Hi @marius311! Thank you for your answer :)

Yeah, for sure I have to add the Jacobian to account for the change of variables.

Also, I have already tried using Turing as you suggested, and it kind of works.

My only problem concerns some details.

When reading the Turing API docs, it looks like several options that I can find in AdvancedHMC are not exposed. For instance, I’d like to define the mass matrix using a previously computed Fisher matrix, specify which kind of adaptation algorithm to use…

Probably I am missing something, but I am not able to do the same within Turing…


I’ll give it a check tomorrow; right now I am too tired.

Thank you again for your suggestions… I think I’ll also take a look inside the Turing repo.

Also: one of the things that makes me wonder whether this is possible is that in Pathfinder @sethaxen writes:

> To use Pathfinder’s estimate of the metric and skip warm-up, at the moment one needs to use AdvancedHMC directly.

But maybe it’s better to check tomorrow😅

When using `Bijectors` you will need to add the Jacobian correction. For a single parameter, sampling from a uniform distribution could be done as follows:

```
using Bijectors, Distributions

function logL(θu)                 # θu is the unconstrained parameter
    pd = Uniform(-2, 2)           # desired density on the constrained space
    bu2c = Inverse(bijector(pd))  # bijector from unconstrained to constrained space
    θc, ladj = with_logabsdet_jacobian(bu2c, θu[1])  # constrained parameter θc and log |det Jacobian|
    logpdf(pd, θc) + ladj         # corrected log density
end
```

What is not so nice about this approach is that the samples returned by `AdvancedHMC` live on the unconstrained space, i.e., you may want to apply the bijector `bu2c` to them before plotting.
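For instance, a minimal sketch of that post-processing step (with dummy draws standing in for the output of `sample`):

```
using Bijectors, Distributions

bu2c = Bijectors.Inverse(bijector(Uniform(-2, 2)))  # unconstrained → constrained

# dummy unconstrained draws, shaped like the vectors-of-vectors AdvancedHMC returns
samples = [[0.3], [-1.2], [2.5]]

# map each draw back to the constrained space before plotting
θ_constrained = [bu2c(θu[1]) for θu in samples]
```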

The following rule of thumb might be helpful to remember whether the log Jacobian needs to be added or subtracted (which I tend to forget):

$$
p(c) \, dc = p(u) \, du \implies p(u) = p(c) \, \frac{dc}{du},
$$

i.e., the density on the unconstrained space p(u), as required by the sampler, is the density on the constrained space p(c), naturally defined in the model, times the Jacobian of the bijector from the unconstrained to the constrained space.
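As a concrete check for `Uniform(-2, 2)`: assuming the usual logistic-type transform for a bounded interval, the inverse bijector is c = 4\sigma(u) - 2 with \sigma(u) = 1/(1 + e^{-u}), so

$$
\frac{dc}{du} = 4\,\sigma(u)\bigl(1 - \sigma(u)\bigr),
\qquad
\log p(u) = \log p(c) + \log\bigl(4\,\sigma(u)(1 - \sigma(u))\bigr),
$$

and the second term is the role played by `ladj` in the snippet above.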


No, currently it’s not possible to set the mass matrix through the Turing interface. This PR will eventually solve that, though currently it’s delayed by some AdvancedHMC refactors I haven’t had time for: Allow more AdvancedHMC options in HMC types by sethaxen · Pull Request #1818 · TuringLang/Turing.jl · GitHub
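In the meantime, dropping down to AdvancedHMC lets you pass the precomputed (inverse-)Fisher matrix as a dense Euclidean metric yourself. A minimal sketch, with a toy log-density and identity matrix as stand-ins for your own quantities, and API names taken from the AdvancedHMC README (which may differ slightly between versions):

```
using AdvancedHMC, ForwardDiff, LinearAlgebra

# stand-ins: replace with your model's log-density and precomputed inverse Fisher matrix
logdensity(θ) = -dot(θ, θ) / 2
inv_fisher = Matrix(1.0I, 2, 2)

metric = DenseEuclideanMetric(inv_fisher)   # fixed M⁻¹ from the Fisher matrix
hamiltonian = Hamiltonian(metric, logdensity, ForwardDiff)

initial_θ = zeros(2)
ϵ = find_good_stepsize(hamiltonian, initial_θ)
integrator = Leapfrog(ϵ)
kernel = HMCKernel(Trajectory{MultinomialTS}(integrator, GeneralisedNoUTurn()))

# adapt only the step size, leaving the supplied metric untouched
adaptor = StepSizeAdaptor(0.8, integrator)
samples, stats = sample(hamiltonian, kernel, initial_θ, 1_000, adaptor, 500; progress = false)
```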
