I just gave it a try with a toy exponential mixture model. It works well and produces similar results to ApproxBayes.jl's ABCSMC, but seems to be a bit less sample-efficient.
When using the same \epsilon, KissABC.jl uses about 4x more samples, but produces a posterior with tighter 95% intervals (I’m guessing this is related to ApproxBayes.jl stopping early because of the default tolerance). When I increase the \epsilon for KissABC.jl to get a similar posterior, it still requires 2.5x more samples than ApproxBayes.jl.
Would you expect better results, or is this just a case of the method not working as well on this specific model?
using Random, Statistics

# Two-component exponential mixture: scale u1 with probability p1, else u2,
# where u2 is chosen so that the overall mean is exactly 1.
function sim((u1, p1), params; n=10^6, raw=false)
    u2 = (1.0 - u1*p1)/(1.0 - p1)
    x = randexp(n) .* ifelse.(rand(n) .< p1, u1, u2)
    raw && return x
    [std(x), median(x)]  # summary statistics: standard deviation and median
end
# Relative Euclidean distance between simulated and observed summaries
function dist(s, s0)
    sqrt(sum(((s .- s0)./s).^2))
end
using Distributions, KissABC

ABCSMCPR(Factored(Uniform(0,1), Uniform(0.5,1)), sim, [2.2, 0.4], dist, 0.01, nparticles=100, parallel=true)
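For anyone who wants to poke at the toy model outside Julia, here is a rough NumPy translation of the simulator and distance above. This is a sketch of the test problem only, not of either ABC implementation; the function names mirror the Julia code but are otherwise my own.

```python
import numpy as np

def sim(u1, p1, n=10**6, rng=None):
    """Exponential mixture: scale u1 with probability p1, else u2,
    with u2 chosen so the overall mean of the mixture is exactly 1.
    Returns the summary statistics [std, median]."""
    rng = np.random.default_rng(rng)
    u2 = (1.0 - u1 * p1) / (1.0 - p1)
    scale = np.where(rng.random(n) < p1, u1, u2)
    x = rng.standard_exponential(n) * scale
    return np.array([x.std(), np.median(x)])

def dist(s, s0):
    """Relative Euclidean distance between summary vectors."""
    return np.sqrt(np.sum(((s - s0) / s) ** 2))
```

With parameters drawn from the priors above (u1 in (0,1), p1 in (0.5,1)), u2 stays positive and the mixture mean is 1 by construction, so the summaries are directly comparable to the target `[2.2, 0.4]`.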