The README for the project has a worked example for a mixture of normals model.
The mixture of normals model (see the file MNlib.jl for details) draws statistics using the function
```julia
using Statistics  # needed for quantile

function auxstat(θ)
    n = 1000
    μ_1, μ_2, σ_1, σ_2, prob = θ
    # draw n observations from each of the two normal components
    d1 = randn(n) .* σ_1 .+ μ_1
    d2 = randn(n) .* σ_2 .+ μ_2
    # mix the components: each observation comes from component 1 with probability prob
    ps = rand(n) .< prob
    data = zeros(n)
    data[ps] .= d1[ps]
    data[.!ps] .= d2[.!ps]
    # scaled quantiles of the sample, at probabilities 0.0, 0.1, ..., 1.0
    r = 0:0.1:1
    sqrt(Float64(n)) .* quantile.(Ref(data), r)
end
```
So there are five parameters and 11 summary statistics. Samples of 1000 observations are used to compute the statistics. The “true” parameter values that we will use to evaluate performance and confidence interval coverage are from
```julia
function TrueParameters()
    [1.0, 0.0, 0.2, 2.0, 0.4]
end
```
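To see how these pieces fit together, here's a quick illustrative snippet (my own, not from the README): calling auxstat at the true parameter values draws a fresh sample of 1000 observations and returns the 11 scaled quantiles.

```julia
# Illustrative usage, assuming auxstat and TrueParameters are defined as above
θ = TrueParameters()
z = auxstat(θ)   # one draw of the summary statistics at the true parameters
length(z)        # 11: quantiles at probabilities 0.0, 0.1, ..., 1.0
```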
In a Monte Carlo study in which this model is estimated 1000 times, summary statistics of the parameter estimates over the 1000 replications, and confidence interval coverage rates for the true parameters, are as follows:
In the first table, it can be seen that the parameters are estimated with little bias and good precision. In the second table, one can see that confidence interval coverage is quite close to the nominal values for each of the five parameters.
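For intuition, the bookkeeping for such a study can be organized roughly as in the sketch below. Here `estimate` is a hypothetical stand-in for the package's estimation routine, assumed to return a point estimate and a confidence interval (lower, upper) for each parameter at the chosen nominal level; only the computation of bias, precision, and coverage is shown.

```julia
# Minimal sketch of the Monte Carlo bookkeeping; `estimate` is hypothetical.
using Statistics

function montecarlo(reps = 1000)
    θ0 = TrueParameters()
    k = length(θ0)
    estimates = zeros(reps, k)
    covered = zeros(Bool, reps, k)
    for r in 1:reps
        # estimate the model from a fresh draw of the summary statistics
        θhat, lower, upper = estimate(auxstat(θ0))
        estimates[r, :] .= θhat
        covered[r, :] .= (lower .<= θ0) .& (θ0 .<= upper)
    end
    # bias and precision of the estimates, and confidence interval coverage
    bias = vec(mean(estimates, dims = 1)) .- θ0
    rmse = sqrt.(vec(mean((estimates .- θ0') .^ 2, dims = 1)))
    coverage = vec(mean(covered, dims = 1))
    bias, rmse, coverage
end
```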
Here’s an example of a posterior density plot for the first parameter (true value is 1.0), from a single estimation:
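A plot of this sort can be produced from the posterior draws with, e.g., StatsPlots; in the sketch below, `chain` is a hypothetical matrix of posterior draws (rows are draws, columns are parameters), not an object defined in the README.

```julia
# Sketch only: `chain` is assumed to be a matrix of posterior draws,
# one column per parameter, produced by the estimation step.
using StatsPlots

density(chain[:, 1], label = "posterior, parameter 1")
vline!([TrueParameters()[1]], label = "true value")
```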
So far, there are four example models, all of which are quite simple, and the results illustrated in the README for the mixture of normals model also hold, qualitatively, for these other models. I expect to have results for a more realistic jump diffusion model before too long.