 # Global optimization: Simulated Method of Moments

#1

I work with non-linear models that need to be calibrated to match data moments. The setup is simple. I have a function that takes a set of parameters as input (for example, a vector of floats), solves the model, and returns a measure of the distance between the model-generated moments and the data moments. Typically there are more moments than parameters. Hence, I use some simple weighting function to summarize the differences between model and data moments into one float. The objective is to minimize this distance.

Relevant for this problem:

• Each parameter needs to lie within a given interval.
• It is time-consuming to solve the model (it can take from several minutes to several hours).
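The objective described above can be sketched as follows (all names are hypothetical; `solve_model` stands in for the expensive solver, and the identity weighting matrix is just for illustration):

```julia
# Sketch of an SMM objective: weighted distance between model and data moments.
# `solve_model` is a hypothetical stand-in for the expensive model solver.
function solve_model(θ::Vector{Float64})
    # placeholder: pretend the model's moments are simple transforms of θ
    return [θ[1] + θ[2], θ[1] * θ[2], θ[1]^2]
end

function smm_distance(θ, data_moments, W)
    g = solve_model(θ) .- data_moments   # moment differences (more moments than parameters)
    return g' * W * g                    # scalar weighted distance
end

data_moments = [3.0, 2.0, 1.0]
W = [1.0 0 0; 0 1.0 0; 0 0 1.0]          # identity weighting
smm_distance([1.0, 2.0], data_moments, W)   # 0.0 at the "true" parameters
```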

My usual approach was to code the function that simulates the model in Fortran and then use MATLAB’s genetic algorithm or direct search to find the parameters that minimize the distance between the model and data moments. The good thing about that approach is that it was incredibly easy to parallelize the MATLAB optimization. I would like to do everything in Julia now.

I have looked at different options:

• `Optim` package. I think I could use the Nelder-Mead algorithm but I’m not sure if it’s possible/how to specify intervals for the parameters.
• `JuMP` package. Not sure if there is an option that could work for my case.

I appreciate any suggestion.

Thanks.

#2

Have you tried NLopt.jl?
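A minimal NLopt.jl sketch with box constraints might look like this (a derivative-free local algorithm, Sbplx, on a toy quadratic standing in for the SMM distance):

```julia
using NLopt

opt = Opt(:LN_SBPLX, 2)            # derivative-free algorithm, 2 parameters
opt.lower_bounds = [0.0, 0.0]
opt.upper_bounds = [5.0, 5.0]
opt.xtol_rel = 1e-6

# toy objective standing in for the SMM distance; grad is unused (derivative-free)
objective(x, grad) = (x[1] - 1.0)^2 + (x[2] - 2.0)^2
opt.min_objective = objective

minf, minx, ret = NLopt.optimize(opt, [2.5, 2.5])
```

For a time-consuming objective, NLopt also has global derivative-free algorithms (e.g. `:GN_DIRECT_L`) that take the same bounds.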

2 Likes
#3

Optim supports constraints on the parameters:

http://julianlsolvers.github.io/Optim.jl/latest/#user/minimization/#box-constrained-optimization
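A box-constrained call in Optim.jl looks roughly like this (`Fminbox` wraps an inner optimizer, here Nelder-Mead as in the question; the objective is a toy stand-in):

```julia
using Optim

f(θ) = (θ[1] - 1.0)^2 + (θ[2] - 2.0)^2   # toy stand-in for the SMM distance
lower = [0.0, 0.0]
upper = [5.0, 5.0]
θ0    = [2.5, 2.5]

res = optimize(f, lower, upper, θ0, Fminbox(NelderMead()))
Optim.minimizer(res)
```

Optim also ships `SAMIN()`, a simulated-annealing method that takes the same bounds and may be more robust for noisy objectives.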

You can also check BlackBoxOptim.jl.

Note that if a particular algorithm doesn’t support constraints, you can always bake them into your error function by adding a penalty term (e.g. exponential) if a parameter exceeds its bounds, and then clamping the parameter (so that your error function doesn’t crash).
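The penalty-plus-clamp idea can be sketched as follows (`raw_distance` is a hypothetical stand-in for the expensive objective; the penalty scale is arbitrary):

```julia
# Penalty-plus-clamp wrapper for an optimizer without native box constraints.
raw_distance(θ) = (θ[1] - 1.0)^2 + (θ[2] - 2.0)^2   # hypothetical SMM distance

function penalized(θ, lower, upper; scale = 1e3)
    # exponential penalty that grows with the total size of the violation
    violation = sum(max.(lower .- θ, 0.0)) + sum(max.(θ .- upper, 0.0))
    penalty = violation > 0 ? scale * expm1(violation) : 0.0
    θc = clamp.(θ, lower, upper)   # clamp so the model solver never sees bad values
    return raw_distance(θc) + penalty
end

penalized([6.0, 2.0], [0.0, 0.0], [5.0, 5.0])   # evaluated at the clamped [5, 2], plus penalty
```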

You could also try to compute the gradient of your error function with one of the automatic differentiation packages (e.g. ForwardDiff.jl).
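A minimal ForwardDiff.jl call looks like this (toy objective; differentiating through a long simulation only works if the solver code is written generically, without hard-coded `Float64` containers):

```julia
using ForwardDiff

# toy differentiable stand-in for the error function
f(θ) = (θ[1] - 1.0)^2 + 3.0 * (θ[2] - 2.0)^2

ForwardDiff.gradient(f, [2.0, 1.0])   # [2.0, -6.0]
```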

That said, several hours to compute the error function once is pretty scary.

2 Likes
#4

My experience is that doing that is quite inefficient.

#5

SMM/II objectives are stochastic, since the simulated data has a random element, so the deterministic algorithms in Optim may fail. Sometimes you can get around this with common random numbers and treat the objective as deterministic.
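The common-random-numbers trick amounts to fixing the simulation shocks across evaluations, so the objective becomes a deterministic function of the parameters; a toy sketch (the moment formula is made up):

```julia
using Random, Statistics

# Re-seeding the RNG on every call makes the shock draws identical across
# evaluations, so simulation noise does not vary with θ.
function simulated_moment(θ; seed = 42, n = 10_000)
    rng = MersenneTwister(seed)       # same shocks for every θ
    shocks = randn(rng, n)
    return mean(θ .* shocks .+ θ^2)   # toy "simulated" moment
end

simulated_moment(1.5) == simulated_moment(1.5)   # true: no simulation noise
```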

For difficult problems, some people like

```bibtex
@article{chernozhukov2003mcmc,
  title     = {An {MCMC} approach to classical estimation},
  author    = {Chernozhukov, Victor and Hong, Han},
  journal   = {Journal of Econometrics},
  volume    = {115},
  number    = {2},
  pages     = {293--346},
  year      = {2003},
  publisher = {Elsevier}
}
```

which is trivial to implement, but my experience with it has been mixed: it is basically Metropolis-Hastings, with quadratic cost and tuning problems. Bayesian optimization may dominate it, depending on the number of parameters. YMMV.

As for the parameter constraints: you can always do a domain transformation, e.g. with a transformation package

(disclaimer: my package), and work on $\mathbb{R}^n$, or just return `Inf` from the distance function (`-Inf` from a log-posterior) when outside the domain. Both become tricky when the good solutions pile up at the edge of the domain, though that is usually an indicator of a problem with the model.
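A hand-rolled version of such a domain transformation is just the logistic map from the real line onto an interval (a sketch, not any particular package's API):

```julia
# Map an unconstrained real onto the interval (a, b) via the logistic function,
# so the optimizer works on ℝⁿ while the model only ever sees valid parameters.
to_interval(x, a, b)   = a + (b - a) / (1 + exp(-x))
from_interval(y, a, b) = log((y - a) / (b - y))    # inverse map

to_interval(0.0, 2.0, 4.0)     # 3.0, the midpoint of (2, 4)
from_interval(3.0, 2.0, 4.0)   # 0.0
```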

1 Like
#6

I have some econometrics lecture notes that discuss estimating a DSGE model by GMM. It uses simulated annealing, with bounds on the parameters. In the past, I have estimated this model using SMM, using Octave, but the model was simulated using Dynare, and there’s no convenient substitute for Dynare for Julia, so far. The code is here: https://github.com/mcreel/Econometrics/tree/master/Examples/DSGE/GMM

The Chernozhukov-Hong method mentioned by tpapp is in the https://github.com/mcreel/Econometrics/tree/master/Examples/DSGE/Bayesian directory.

For parallelizing simulations of a model, Julia gives a number of options. One that I have used and like is the MPI.jl package.
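Besides MPI.jl, the standard-library Distributed module is another option for farming out simulations; a minimal sketch (`simulate_one` is a hypothetical stand-in for a per-draw model simulation):

```julia
using Distributed
addprocs(4)                        # one worker per core; adjust to your machine

@everywhere using Random
# hypothetical per-seed simulation; a real model solver would go here
@everywhere simulate_one(seed) = sum(randn(MersenneTwister(seed), 10_000))

# run the expensive simulations in parallel across the workers
results = pmap(simulate_one, 1:16)
```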

I will try to add some clean examples of SMM estimation to my notes when I have some time…

3 Likes
#7

A colleague told me that he had a positive experience with indirect inference in a difficult model using evolutionary algorithms, specifically CMA-ES (covariance matrix adaptation evolution strategy). Julia implementations seem dormant. If anyone has experience with this algorithm, I would be interested in learning the details.

1 Like
#8

I had some positive experience with CMA-ES on a truss design optimization problem. I was using MATLAB at the time, but the algorithm seems to be implemented in https://github.com/wildart/Evolutionary.jl.

2 Likes