Extract Turing model to LogDensityProblems and transform variables

Hi all,
As a follow-up to Recommended way to extract logprob and build gradients in Turing.jl?:
I would like to extract the log probability density from a Turing model (to use with Pathfinder.jl, for example),
and I came across the LogDensityFunction implementation in DynamicPPL:

f = LogDensityFunction(model) 

This wraps the model in the LogDensityProblems interface.
However, is it possible to transform the model's variables to unconstrained Euclidean space beforehand, so that MCMC can be performed without manually transforming each parameter with TransformVariables.jl?

I've stumbled upon the link!! function from DynamicPPL, but I have no clue how to use it.

You can use Turing directly with Pathfinder, similar to the examples in Turing usage · Pathfinder.jl

But this should also work for you (adapted from https://github.com/TuringLang/Turing.jl/blob/4affc28b341f4763bd1abc8523e4e209e9f6aa6e/src/contrib/inference/abstractmcmc.jl#L38-L48):


julia> using DynamicPPL, LogDensityProblems, LogDensityProblemsAD, ForwardDiff, Distributions

julia> @model function foo()
           x ~ Gamma()
       end;

julia> model = foo();

julia> ℓ = DynamicPPL.LogDensityFunction(model);

julia> LogDensityProblems.logdensity(ℓ, [-10.0])  # constrained
-Inf

julia> DynamicPPL.link!!(ℓ.varinfo, model);

julia> LogDensityProblems.logdensity(ℓ, [-10.0])  # unconstrained
-10.000045399929762
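
The unconstrained value can be checked by hand. Gamma() defaults to Gamma(1, 1), whose log pdf is simply -x, and the link transform for a positive-support variable is y = log(x), which contributes a log-Jacobian term of y. A small sanity check (using only Distributions, nothing specific to DynamicPPL):

```julia
using Distributions

# Under y = log(x), the unconstrained log density is
# logpdf(Gamma(), exp(y)) + y, where the trailing y is the
# log-Jacobian of the exp transform.
y = -10.0
logpdf(Gamma(), exp(y)) + y  # ≈ -10.000045399929762, matching the REPL output above
```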

julia> ℓ_with_grad = LogDensityProblemsAD.ADgradient(Val(:ForwardDiff), ℓ);

julia> using Pathfinder

julia> pfr = pathfinder(ℓ_with_grad);
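
One thing to keep in mind: since ℓ now operates in unconstrained space, the draws Pathfinder returns are unconstrained too. To get draws on the original (positive) scale, apply the inverse link. A minimal sketch, assuming the result stores the draws as a dims × ndraws matrix in a `draws` field:

```julia
using Bijectors

# bijector(Gamma()) maps the positive support to ℝ (a log transform);
# its inverse maps unconstrained draws back to the constrained scale.
b⁻¹ = inverse(bijector(Gamma()))
constrained_draws = b⁻¹.(pfr.draws)
```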

Thanks! Works like a charm 🙂