I’m happy to announce that the package SimulatedNeuralMoments.jl is available in the General registry. It implements inference methods that can be thought of either as approximate Bayesian computation (ABC) with a particular choice of criterion, or as a method of simulated moments (MSM) estimator that uses Bayesian tools.
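For readers less familiar with MSM, the basic criterion can be sketched with a toy example: pick the parameter that makes simulated moments match observed moments. This is my own illustration in Python with an identity weight matrix, not code from the package; all names here are hypothetical.

```python
import random
import statistics

# Toy MSM criterion. Model: data ~ Normal(theta, 1).
# Moments: sample mean and variance. All names are illustrative.

def moments(data):
    return [statistics.fmean(data), statistics.pvariance(data)]

def simulate(theta, n, rng):
    return [rng.gauss(theta, 1.0) for _ in range(n)]

def msm_criterion(theta, obs_moments, n, rng, reps=20):
    # Average moments over several simulated samples, then take an
    # (identity-weighted) quadratic distance to the observed moments.
    sims = [moments(simulate(theta, n, rng)) for _ in range(reps)]
    avg = [statistics.fmean(col) for col in zip(*sims)]
    return sum((a - b) ** 2 for a, b in zip(avg, obs_moments))

rng = random.Random(1)
obs = simulate(1.5, 500, rng)   # stand-in for the real data
m_obs = moments(obs)
# The criterion is smaller near the true theta = 1.5 than far from it.
near = msm_criterion(1.5, m_obs, 500, rng)
far = msm_criterion(3.0, m_obs, 500, rng)
```

In practice the weight matrix matters for efficiency; the identity is used here only to keep the sketch short.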
An important feature of the methods is that the statistics used to identify the parameters are filtered through a neural net. This process is automatic and requires no intervention by the user. So far, Monte Carlo evidence shows that the methods lead to confidence/credible intervals with proper coverage, at sample sizes representative of real data.
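The idea behind the filtering can be sketched as follows: simulate (parameter, statistics) pairs, fit a mapping from statistics back to the parameter, and use the fitted output as the "filtered" statistic. The package trains a neural net for this; in the dependency-free Python sketch below, a linear model fit by stochastic gradient descent stands in for the net, and every name is illustrative rather than the package's API.

```python
import random

# Sketch of the "filtering" idea: simulate (theta, statistics) pairs,
# then fit a mapping from statistics back to theta. A linear model
# trained by SGD stands in for the package's neural net.

def stats_of(data):
    n = len(data)
    mean = sum(data) / n
    var = sum((x - mean) ** 2 for x in data) / n
    return [mean, var]

def simulate(theta, n, rng):
    return [rng.gauss(theta, 1.0) for _ in range(n)]

rng = random.Random(42)
# Training set: draw theta from the prior, simulate, compute statistics.
train = []
for _ in range(2000):
    theta = rng.uniform(0.0, 3.0)
    train.append((stats_of(simulate(theta, 100, rng)), theta))

# Fit theta ~ w0 + w1*mean + w2*var by stochastic gradient descent.
w = [0.0, 0.0, 0.0]
lr = 0.01
for _ in range(20):
    for s, theta in train:
        err = w[0] + w[1] * s[0] + w[2] * s[1] - theta
        w[0] -= lr * err
        w[1] -= lr * err * s[0]
        w[2] -= lr * err * s[1]

# Applying the fitted mapping to the observed statistics gives the
# "filtered" statistic used to identify the parameter.
obs_stats = stats_of(simulate(1.5, 100, rng))
estimate = w[0] + w[1] * obs_stats[0] + w[2] * obs_stats[1]
```

The neural net plays the same role but can capture nonlinear relationships between the raw statistics and the parameters.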
An example showing how to use the methods with real data has been added: it estimates a simple stochastic volatility model. There is an explanation at https://github.com/mcreel/SimulatedNeuralMoments.jl/blob/main/examples/SV/README.md
Everything runs fine with Julia 1.5.3 or 1.6 beta1.
A new example that estimates a small DSGE model has been added: https://github.com/mcreel/SimulatedNeuralMoments.jl/tree/main/examples/DSGE
This makes use of the SolveDSGE.jl package, which I find to be very nice and straightforward to use (and fast!) for solving and simulating DSGE models.
v1.0.1 of this package has been released (thanks to @giordano for solving the mess I made trying to register v1.0.0). This version uses Turing for sampling, and the two examples, a mixture of normals model and a stochastic volatility model, show how approximate Bayesian computation (ABC) / method of simulated moments (MSM) may be done using Turing.
The sampling is done using AdvancedMH. People sometimes ask: why not NUTS? The reason is that the likelihood is computed by simulation, and it is neither continuous nor differentiable, so gradient-based samplers do not apply. However, the neural net fit gives a very good starting value for the chain, and a very good covariance estimate for a random walk proposal, so sampling is effective.
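The random-walk Metropolis-Hastings scheme itself can be sketched in a few lines. The Python sketch below uses a simple closed-form log-density as a stand-in for the simulated objective, and the starting value and proposal scale are hand-picked; in the package they come from the neural net fit. All names are illustrative.

```python
import math
import random

# Random-walk Metropolis-Hastings sketch. A Normal(1.5, 0.5) log-density
# stands in for the simulated-moments objective, which in the real
# method need not be continuous or differentiable.

def log_target(theta):
    return -0.5 * ((theta - 1.5) / 0.5) ** 2

def rwmh(start, scale, n_draws, rng):
    chain = [start]
    logp = log_target(start)
    for _ in range(n_draws):
        prop = chain[-1] + rng.gauss(0.0, scale)
        logp_prop = log_target(prop)
        # Symmetric proposal, so the acceptance ratio is just the
        # ratio of target densities.
        if math.log(rng.random() + 1e-300) < logp_prop - logp:
            chain.append(prop)
            logp = logp_prop
        else:
            chain.append(chain[-1])
    return chain

rng = random.Random(7)
# A good start and proposal scale make the chain mix well; here they
# are hand-picked, in the package they come from the neural net fit.
chain = rwmh(start=1.5, scale=0.5, n_draws=5000, rng=rng)
post_mean = sum(chain[500:]) / len(chain[500:])
```

Only evaluations of the objective are needed, never its gradient, which is why this sampler works with a simulated, non-smooth likelihood.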