Global optimization: Simulated Method of Moments

I work with non-linear models that need to be calibrated to match data moments. The setup is simple. I have a function that takes a set of parameters as input (for example, a vector of floats), solves the model, and returns a measure of the distance between the model-generated moments and the data moments. Typically there are more moments than parameters. Hence, I use some simple weighting function to summarize the differences between model and data moments into one float. The objective is to minimize this distance.
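
In code, the setup looks roughly like this (a minimal sketch; solve_model, data_moments, and W stand in for my actual model solver, data moments, and weighting matrix):

# θ is the parameter vector; the return value is the scalar distance.
function smm_objective(θ::Vector{Float64})
    model_moments = solve_model(θ)       # expensive: minutes to hours per call
    g = model_moments .- data_moments    # more moments than parameters
    return g' * W * g                    # weighted into a single float
end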

Relevant for this problem:

  • Each parameter needs to lie within a given interval.
  • It is time-consuming to solve the model (it can take from several minutes to several hours).

My usual approach was to code the function that simulates the model in Fortran and then use MATLAB’s genetic algorithm or direct search to find the parameters that minimize the distance between the model and data moments. The good thing about that approach was that it was incredibly easy to parallelize the MATLAB optimization. I would like to do everything in Julia now.

I have looked at different options:

  • Optim package. I think I could use the Nelder-Mead algorithm, but I’m not sure whether (or how) it’s possible to specify intervals for the parameters.
  • JuMP package. I’m not sure whether there is an option that would work for my case.

I appreciate any suggestion.

Thanks.


Have you tried NLopt.jl?
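
Something like this gives you bounded, derivative-free Nelder-Mead (a sketch; the objective, bounds, and evaluation budget are made up):

using NLopt

opt = Opt(:LN_NELDERMEAD, 3)                           # 3 parameters
opt.lower_bounds = zeros(3)
opt.upper_bounds = ones(3)
opt.maxeval = 500                                      # cap on expensive evaluations
opt.min_objective = (θ, grad) -> sum(abs2, θ .- 0.5)   # grad is unused (derivative-free)
minf, minθ, ret = optimize(opt, fill(0.9, 3))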


Optim should support constraints on the parameters.

You can also check BlackBoxOptim.jl.
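
For example, a box-constrained search with BlackBoxOptim could look like this (a sketch; the objective and bounds are made up):

using BlackBoxOptim

f(θ) = sum(abs2, θ .- [0.5, 1.0, 2.0])   # stand-in for the SMM distance

res = bboptimize(f;
    SearchRange = [(0.0, 1.0), (0.0, 5.0), (0.5, 4.0)],  # one interval per parameter
    MaxTime = 3600.0)                                    # wall-clock budget in seconds

best_candidate(res), best_fitness(res)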

Note that if a particular algorithm doesn’t support constraints, you can always bake them into your error function by adding a penalty term (e.g. exponential) when a parameter exceeds its bounds, and then clamping the parameter (so that your error function doesn’t crash).
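
A sketch of that penalty-and-clamp idea (names are illustrative):

# Clamp the parameters before solving the model, and add an exponential
# penalty that is zero inside [lower, upper] and grows quickly outside.
function penalized(f, θ, lower, upper)
    excess = max.(lower .- θ, 0.0) .+ max.(θ .- upper, 0.0)
    penalty = sum(exp.(excess)) - length(θ)   # exactly zero when θ is in bounds
    return f(clamp.(θ, lower, upper)) + penalty
end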

You could also try to compute the gradient of your error function with one of the automatic differentiation packages (e.g. ForwardDiff.jl).
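
For example (this only helps if the model code is generic enough to accept ForwardDiff’s dual numbers):

using ForwardDiff

f(θ) = sum(abs2, θ)                       # stand-in for the error function
g = ForwardDiff.gradient(f, [0.3, 0.7])   # exact gradient via dual numbers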

That said, several hours to compute the error function once is pretty scary.


My experience is that doing that is quite inefficient.

Though you can sometimes get away with treating them as deterministic, SMM/II problems are stochastic, since the simulated data has a random element (unless you get around this with common random numbers), so the algorithms in Optim may fail.
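
The usual way to get common random numbers is to draw the shocks once, outside the objective, so repeated evaluations at the same θ return the same value (a sketch; simulate_moments, data_moments, and W are placeholders):

using Random

# Fix the simulation shocks up front so the simulated moments, and hence
# the objective, are a deterministic function of θ.
const shocks = randn(MersenneTwister(1234), 1_000, 500)   # T periods × S replications

function smm_objective(θ)
    g = simulate_moments(θ, shocks) .- data_moments
    return g' * W * g
end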

For difficult problems, some people like

@article{chernozhukov2003mcmc,
  title={An MCMC approach to classical estimation},
  author={Chernozhukov, Victor and Hong, Han},
  journal={Journal of Econometrics},
  volume={115},
  number={2},
  pages={293--346},
  year={2003},
  publisher={Elsevier}
}

which is trivial to implement, but my experience with it has been mixed: it is basically Metropolis-Hastings, with quadratic cost and the usual tuning problems. Bayesian optimization, eg

https://github.com/jbrea/BayesianOptimization.jl

may dominate it, depending on the number of parameters. YMMV.
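
For concreteness, the Chernozhukov-Hong approach treats exp(-Q(θ)) as a quasi-posterior, where Q is the SMM objective, and samples it with a standard kernel. A bare-bones random-walk Metropolis sketch (step size and chain length are illustrative):

using Random

# Random-walk Metropolis on the quasi-posterior exp(-Q(θ)).
function rwm(Q, θ0; n = 10_000, σ = 0.1, rng = MersenneTwister(1))
    θ, q = copy(θ0), Q(θ0)
    chain = Vector{typeof(θ0)}(undef, n)
    for i in 1:n
        θ′ = θ .+ σ .* randn(rng, length(θ))
        q′ = Q(θ′)
        if log(rand(rng)) < q - q′        # accept with probability min(1, exp(q - q′))
            θ, q = θ′, q′
        end
        chain[i] = copy(θ)
    end
    return chain
end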

As for the parameter constraints: you can always do domain transformations, eg

https://github.com/tpapp/TransformVariables.jl

(disclaimer: my package) and work on \mathbb{R}^n, or just return -Inf when outside the domain. Both become tricky when the good solutions pile up on the edge, though that is usually an indicator of a problem with the model.
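
A sketch with TransformVariables (the intervals are illustrative, and smm_objective stands in for the moment-matching objective):

using TransformVariables

# Map unconstrained ℝ² into, say, (0, 1) × (0, ∞).
t = as((as(Real, 0, 1), asℝ₊))

# The optimizer works on y ∈ ℝ²; the model only ever sees valid parameters.
unconstrained_objective(y) = smm_objective(collect(transform(t, y)))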


I have some econometrics lecture notes that discuss estimating a DSGE model by GMM. The example uses simulated annealing, with bounds on the parameters. In the past, I have estimated this model using SMM in Octave, with the model simulated by Dynare; there’s no convenient substitute for Dynare in Julia so far. The code is here: https://github.com/mcreel/Econometrics/tree/master/Examples/DSGE/GMM
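
For reference, Optim has a bounded simulated annealing algorithm, SAMIN; a minimal sketch (objective, bounds, and options are illustrative):

using Optim

f(θ) = sum(abs2, θ .- 0.5)                 # stand-in for the GMM/SMM objective
lower, upper = fill(0.0, 3), fill(1.0, 3)  # box constraints on the parameters

res = Optim.optimize(f, lower, upper, fill(0.9, 3), SAMIN(),
                     Optim.Options(iterations = 10^6))
Optim.minimizer(res)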

The Chernozhukov-Hong method mentioned by tpapp is in the https://github.com/mcreel/Econometrics/tree/master/Examples/DSGE/Bayesian directory.

For parallelizing simulations of a model, Julia gives a number of options. One that I have used and like is the MPI.jl package.
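
A bare-bones MPI.jl pattern, splitting independent replications across ranks and averaging the simulated moments (a sketch; simulate_moments and the sizes are made up):

using MPI

# Run with e.g.: mpiexec -n 8 julia script.jl
MPI.Init()
comm = MPI.COMM_WORLD
rank, nprocs = MPI.Comm_rank(comm), MPI.Comm_size(comm)

θ = [0.5, 1.0]                  # illustrative parameter vector
reps_per_rank = 100
local_sum = zeros(5)            # say, 5 moments
for r in 1:reps_per_rank
    # distinct seed per replication so every rank draws different shocks
    local_sum .+= simulate_moments(θ, rank * reps_per_rank + r)
end

# Sum over all ranks, then average; every rank gets the result.
moments = MPI.Allreduce(local_sum, +, comm) ./ (reps_per_rank * nprocs)
MPI.Finalize()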

I will try to add some clean examples of SMM estimation to my notes when I have some time…


A colleague told me that he had positive experience with indirect inference in a difficult model using genetic algorithms, specifically CMA-ES. Julia implementations seem dormant. If anyone has experience with this algorithm, I would be interested in learning the details.


I had some positive experience with CMA-ES in a truss-design optimization problem. I was using MATLAB at the time, but the algorithm seems to be implemented in https://github.com/wildart/Evolutionary.jl.


CMA-ES works really well. I have used the Haskell version (which uses the Python version under the covers) in two commercial projects. It’s a shame there isn’t a maintained Julia version.

I got around to adding an example of simulated moments estimation to my econometrics notes. The example estimates a simple discrete time stochastic volatility model using moments from an auxiliary model (i.e., it is indirect inference). The version of GMM is the continuously updating estimator (CUE). The example code is here: https://github.com/mcreel/Econometrics/blob/master/Examples/SBEM/EstimateStochasticVolatilityModel.jl

This code requires a little unregistered package, https://github.com/mcreel/SV, which contains the functions that define the moments. Because the simulations are independent of one another, it is natural to use threads to speed them up.
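
The pattern is just a threaded loop over the replications (a sketch; simulate_moments is a placeholder for the model simulator):

using Base.Threads, Random

# Average the moments over S independent simulations, run in parallel.
# Start Julia with JULIA_NUM_THREADS > 1 for this to use multiple threads.
function simulated_moments(θ, S)
    results = Vector{Vector{Float64}}(undef, S)
    @threads for s in 1:S
        rng = MersenneTwister(s)               # per-replication RNG: reproducible, thread-safe
        results[s] = simulate_moments(θ, rng)
    end
    return sum(results) ./ S
end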
