Sampling from power posterior

Ah, there is a typo: it should be varinfo = DynamicPPL.VarInfo(vi, spl, z). The second argument should be spl, which is not the case in the code above.


Oh thanks, I was not sure which ones I was supposed to replace, but now it looks obvious!


I think something like this ought to work in Soss:

using Soss
using Soss: ConditionalModel

function power_posterior(m::ConditionalModel, yname)
    pr = prior(m, yname)         # prior model for the latent variables
    lik = likelihood(m, yname)   # likelihood model for yname

    # Tempered log-density: log p(x) + t * log p(y | x)
    function f(x, t)
        logdensity(pr, x) + t * logdensity(lik, x)
    end

    return f
end

That gets you the log-density; you'd still need to wire things together for sampling.
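For reference, here is a minimal sketch of what that wiring could look like, running random-walk Metropolis-Hastings on the tempered log-density at a fixed t via AdvancedMH.jl. The named-tuple wrapping in logdens and the proposal scale are my assumptions, not anything from this thread:

using AdvancedMH, Distributions, LinearAlgebra

t = 0.5  # inverse temperature in [0, 1]; t = 1 recovers the posterior

# f expects a named tuple, while AdvancedMH proposes vectors, so wrap it:
logdens(θ) = f((x = θ[1],), t)

model = DensityModel(logdens)
sampler = RWMH(MvNormal(zeros(1), 0.5^2 * I))
chain = sample(model, sampler, 2_000)

For an actual thermodynamic integration run you would repeat this over a grid of temperatures t in [0, 1] and combine the results.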

I am not familiar with Soss; what is yname here?

yname is a Symbol indicating the name of the variable that will be observed. pr and lik will both be new models. There might well be some bugs in this, since I haven't tried it yet. But prior and likelihood already work.

Example:

julia> m = @model begin
           x ~ Normal()
           y ~ Normal(x,1) |> iid(3)
           return y
       end;

julia> prior(m, :y)
@model begin
        x ~ Normal()
    end


julia> likelihood(m, :y)
@model x begin
        y ~ Normal(x, 1) |> iid(3)
    end

OK, I think I got it working. This uses the dev branch; I hope to have a merge to master soon, and a release shortly after that.

using Soss
using Soss: ConditionalModel

function power_posterior(cm::ConditionalModel)
    m = cm.model

    # Split the model into its prior and likelihood parts
    pr = prior(m, observed(cm)...)
    lik = likelihood(m, observed(cm)...)

    avs = argvals(cm)
    obs = observations(cm)

    # Tempered log-density: log p(x) + t * log p(y | x)
    function f(t, x)
        logdensity(pr(avs), x) + t * logdensity(lik(merge(avs, x)) | obs)
    end

    return f
end

Example:

julia> m = @model begin
           x ~ Normal()
           y ~ Normal(x,1) |> iid(3)
           return y
       end;

julia> prior(m, :y)
@model begin
        x ~ Normal()
    end


julia> likelihood(m, :y)
@model x begin
        y ~ Normal(x, 1) |> iid(3)
    end


julia> y = rand(m())
3-element Vector{Float64}:
 -1.6864413481805354
 -0.8669993609594797
  0.2291034530479168

julia> f = power_posterior(m() | (;y))
f (generic function with 1 method)

julia> f(1.0, (x=0.2,))
-2.368997803696223

julia> f(0.0, (x=0.2,))
-0.020000000000000004
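(As a sanity check: at t = 0.0 the likelihood term drops out entirely, so the second call is just the prior log-density at x = 0.2. Soss is evidently dropping the Normal's constant term here, which is why the result is exactly -0.2^2 / 2 = -0.02.)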

Interestingly, it's only been through coding this up that I'm seeing some connections with other methods. For example, in a Bayesian bootstrap, instead of one weight for the log-likelihood overall, there's a different weight for each term, so they don't come outside the sum. But algebraically it's very similar.
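To make that concrete (my notation, not the post's): with n observations, the two tempered log-densities are, up to additive constants,

$$\log p_t(x \mid y) = \log p(x) + t \sum_{i=1}^{n} \log p(y_i \mid x),$$

$$\log p_w(x \mid y) = \log p(x) + \sum_{i=1}^{n} w_i \log p(y_i \mid x),$$

where the power posterior uses a single t for every term, while the Bayesian bootstrap draws random per-observation weights (one common choice is w distributed as n times a Dirichlet(1, ..., 1)). Setting every w_i = t recovers the power posterior.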

Alright, the code seems to be up and running for Turing models (at least a simple one :smiley: ):
GitHub - theogf/ThermodynamicIntegration.jl: Thermodynamic Integration for Turing models and more
The package should be registered in 3 days.
The only stupid thing I could not figure out is how to sample from the prior itself (since the search bar of Turing.ml does not work anymore, it's really hard to explore the docs).
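On that last point, here is a small sketch of one way to sample from the prior of a Turing model, using the Prior() sampler (assuming a Turing version recent enough to have it; the demo model is just for illustration):

using Turing

@model function demo()
    x ~ Normal()
    y ~ Normal(x, 1)
end

# Draw from the prior instead of the posterior
prior_chain = sample(demo(), Prior(), 1_000)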

@cscherrer Tell me when this is merged and I'll try to write an interface for Soss models.
