[ANN] SNM: simulated neural moments for ABC or Bayesian method of moments

SNM is a project that uses neural nets to reduce the dimension of the statistics used for Approximate Bayesian Computation (ABC) or the method of simulated moments. The project allows for creation and training of the neural net, and for calculation of the neural moments, given the trained net. It also provides the large-sample indirect likelihood function of the neural moments, which can be used to sample from the posterior using MCMC (a simple version is provided in the project), SMC (not provided), or other methods.
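For illustration, here is a minimal random-walk Metropolis sampler of the kind of simple MCMC routine mentioned above. This is a generic sketch, not the project's own code; the user supplies the log-posterior, a starting value, and a proposal covariance.

using LinearAlgebra

function rwmh(logpost, θ0, Σprop, S)
    θ = copy(θ0)
    lp = logpost(θ)
    chain = zeros(S, length(θ))
    L = cholesky(Symmetric(Σprop)).L        # factor the proposal covariance once
    for s in 1:S
        θtrial = θ .+ L * randn(length(θ))  # random-walk proposal
        lptrial = logpost(θtrial)
        if log(rand()) < lptrial - lp       # Metropolis accept/reject step
            θ, lp = θtrial, lptrial
        end
        chain[s, :] = θ
    end
    chain
end

# toy usage: sample a bivariate standard normal "posterior"
chain = rwmh(θ -> -0.5 * sum(abs2, θ), zeros(2), Matrix(0.25I, 2, 2), 10_000)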

For four simple examples, the methods have been found to give confidence intervals with quite good coverage, as well as estimators with low bias and RMSE.

Comments, improvements, etc., are welcome!

The README for the project has a worked example for a mixture of normals model.

The mixture of normals model (see the file MNlib.jl for details) draws statistics using the function

using Statistics # for quantile

function auxstat(θ)
    n = 1000
    μ_1, μ_2, σ_1, σ_2, prob = θ
    d1 = randn(n) .* σ_1 .+ μ_1    # draws from the first component
    d2 = randn(n) .* σ_2 .+ μ_2    # draws from the second component
    ps = rand(n) .< prob           # component membership indicators
    data = zeros(n)
    data[ps] .= d1[ps]
    data[.!ps] .= d2[.!ps]
    r = 0:0.1:1                    # quantiles at probabilities 0, 0.1, ..., 1
    sqrt(Float64(n)) .* quantile.(Ref(data), r)   # 11 scaled sample quantiles
end

So there are five parameters and 11 summary statistics. Samples of 1000 observations are used to compute the statistics. The “true” parameter values we will use to evaluate performance and confidence interval coverage are from

function TrueParameters()
    [1.0, 0.0, 0.2, 2.0, 0.4]   # μ_1, μ_2, σ_1, σ_2, prob
end
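To make the dimension-reduction step concrete, here is a minimal sketch that trains a small net to map the 11 statistics to the 5 parameters, so that the fitted net's outputs can serve as 5 neural moments. The prior bounds and the architecture below are illustrative assumptions, and the training code is plain Flux.jl, not the package's own routine.

using Flux, Statistics

# illustrative uniform prior bounds for (μ_1, μ_2, σ_1, σ_2, prob);
# these are assumptions for the sketch, not the project's actual prior
lb = [0.0, -1.0, 0.0, 0.1, 0.05]
ub = [3.0,  1.0, 1.0, 3.0, 0.95]
sample_prior() = lb .+ rand(5) .* (ub .- lb)

# training pairs: columns of Y are parameter draws, columns of X the statistics
S = 10_000
Y = reduce(hcat, [sample_prior() for _ in 1:S])
X = reduce(hcat, [auxstat(Y[:, s]) for s in 1:S])

# a small net mapping the 11 statistics to the 5 parameters
net = Chain(Dense(11 => 32, tanh), Dense(32 => 5))
opt = Flux.setup(Adam(), net)
data = Flux.DataLoader((Float32.(X), Float32.(Y)), batchsize = 128, shuffle = true)
for epoch in 1:20
    Flux.train!((m, x, y) -> Flux.mse(m(x), y), net, data, opt)
end

# the neural moments for a vector of statistics z are just the net's outputs
neural_moments(z) = net(Float32.(z))
neural_moments(auxstat(TrueParameters()))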

In a Monte Carlo study in which this model is estimated 1000 times, the summary statistics for the 1000 replications and the confidence interval coverage rates for the true parameters are as follows:

[Tables: Monte Carlo summary statistics and confidence interval coverage; see the README]

In the first table, it can be seen that the parameters are estimated with little bias and good precision. In the second table, one can see that confidence interval coverage is quite close to the nominal values for each of the five parameters.
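The design of such a study can be sketched as follows. Here estimate is a stand-in for one full SNM fit (simulate data at θ, compute the neural moments, sample the posterior, and report the posterior mean together with a 90% interval); its body below is a placeholder stub so the loop runs, not the actual estimator.

using Statistics

# placeholder stub: replace with the real SNM estimation step; it should
# return point estimates plus lower and upper 90% interval limits
estimate(θ) = (θ .+ 0.05 .* randn(length(θ)), θ .- 0.1, θ .+ 0.1)

θtrue = TrueParameters()
R = 1000
fits = [estimate(θtrue) for _ in 1:R]
est = reduce(hcat, first.(fits))                        # 5 × R point estimates
bias = vec(mean(est, dims = 2)) .- θtrue
rmse = sqrt.(vec(mean((est .- θtrue) .^ 2, dims = 2)))
coverage = [mean(fits[r][2][i] <= θtrue[i] <= fits[r][3][i] for r in 1:R)
            for i in eachindex(θtrue)]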

Here’s an example of a posterior density plot for the first parameter (true value is 1.0), from a single estimation:
[Figure: MNp1, posterior density of the first parameter]
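A plot like this can be produced from the MCMC output with, for example, StatsPlots.jl. The sketch below assumes chain is an S × 5 matrix of posterior draws, as returned by a sampler like the one above.

using StatsPlots

# kernel density estimate of the posterior draws for the first parameter
density(chain[:, 1], label = "posterior", xlabel = "μ_1")
vline!([1.0], label = "true value")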

So far, there are four example models, all of which are quite simple, and the results illustrated in the README for the mixture of normals model also hold, qualitatively, for these other models. I expect to have results for a more realistic jump diffusion model before too long.

The archive has been updated with new results for a small DSGE (dynamic stochastic general equilibrium) model, which was possible thanks to the SolveDSGE.jl package. There are also results for a jump diffusion model estimated using returns of the S&P 500 index, made possible by the DifferentialEquations.jl package. For all of the test models for which a Monte Carlo study was done, confidence interval coverage is very good when the continuously updating (CUE) version of the “log likelihood” of the statistics is used.
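To make the CUE idea concrete: in the continuously updating version, the simulated mean and covariance of the moments are recomputed at every candidate θ, instead of being held fixed at a first-stage estimate. The sketch below is a generic Gaussian quasi-log-likelihood with that structure; moments(θ) stands in for one simulated neural moment vector at θ, z for the moments of the observed data, and the project's exact objective may differ in details, such as whether the log-determinant term is included.

using Statistics, LinearAlgebra

function cue_loglike(z, moments, θ; S = 100)
    M = reduce(hcat, [moments(θ) for _ in 1:S])  # k × S simulated moments
    m̄ = vec(mean(M, dims = 2))                  # simulated mean at this θ
    Σ = cov(M, dims = 2)                         # covariance, updated at this θ
    e = z .- m̄
    -0.5 * (logdet(Σ) + dot(e, Σ \ e))
end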
