I work with non-linear models that need to be calibrated to match data moments. The setup is simple. I have a function that takes a set of parameters as input (for example, a vector of floats), solves the model, and returns a measure of the distance between the model-generated moments and the data moments. Typically there are more moments than parameters. Hence, I use some simple weighting function to summarize the differences between model and data moments into one float. The objective is to minimize this distance.

Relevant for this problem:

Each parameter needs to lie within a given interval.

It is time-consuming to solve the model (it can take from several minutes to several hours).

My usual approach was to code the function that simulates the model in Fortran and then use MATLAB's genetic algorithm or direct search to find the parameters that minimize the distance between the model and data moments. The good thing about that approach is that it was incredibly easy to parallelize the MATLAB optimization. I would like to do everything in Julia now.

I have looked at different options:

Optim package. I think I could use the Nelder-Mead algorithm, but I'm not sure whether it's possible (or how) to specify intervals for the parameters.

JuMP package. Not sure if there is an option that could work for my case.
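On the Optim question: Optim.jl does support box constraints through `Fminbox`, which wraps an inner optimizer (including gradient-free ones like Nelder-Mead). A minimal sketch, where the quadratic objective is just a stand-in for a real moment-distance function:

```julia
using Optim  # not in the standard library; assumes Optim.jl is installed

# Stand-in objective: in practice this would solve the model at θ
# and return the weighted distance between model and data moments.
data_moments = [1.0, 2.0]
objective(θ) = sum(abs2, [θ[1]^2, θ[1] + θ[2]] .- data_moments)

lower = [0.0, 0.0]   # lower bounds on the parameters
upper = [2.0, 3.0]   # upper bounds
θ0    = [0.5, 0.5]   # initial guess, strictly inside the box

# Fminbox turns an unconstrained method into a box-constrained one.
res = optimize(objective, lower, upper, θ0, Fminbox(NelderMead()))
θ̂   = Optim.minimizer(res)
```

Note that `Fminbox` uses a barrier, so the initial guess should lie strictly inside the box.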

Note that if a particular algorithm doesn't support constraints, you can always bake them into your error function by adding a penalty term (e.g. an exponential one) when a parameter exceeds its bounds, and then clamping the parameter (so that your error function doesn't crash).
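A sketch of that trick (the bounds and the toy objective are made up for illustration):

```julia
lower = [0.0, -1.0]
upper = [2.0,  1.0]

# Stand-in for the expensive moment-distance function.
model_distance(θ) = sum(abs2, θ .- [1.0, 0.5])

function penalized_objective(θ)
    # Exponential penalty for each bound violation (zero inside the box)...
    violation = max.(lower .- θ, θ .- upper, 0.0)
    penalty   = sum(expm1.(violation))
    # ...then clamp, so the model solver never sees out-of-bounds values.
    θc = clamp.(θ, lower, upper)
    return model_distance(θc) + penalty
end
```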

You could also try to compute the gradient of your error function with one of the automatic differentiation packages (e.g. ForwardDiff.jl).
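For example (the caveat being that the whole model solve then has to be differentiable Julia code, written generically enough for ForwardDiff's dual numbers to flow through it):

```julia
using ForwardDiff  # not in the standard library; assumes ForwardDiff.jl is installed

# Toy objective; a real one must accept any Real element type,
# not just Float64, for automatic differentiation to work.
f(θ) = sum(abs2, θ .- [1.0, 0.5])

g = ForwardDiff.gradient(f, [0.0, 0.0])  # exact gradient, no finite differences
```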

That said, several hours to compute the error function once is pretty scary.

Though sometimes you can get away with treating them as deterministic, SMM/indirect inference objectives are stochastic, since the simulated data has a random element (unless you get around this with common random variables), so the algorithms in Optim may fail.
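The common-random-variables trick is easy to set up in Julia: reseed a local RNG inside the objective (or draw the shocks once up front), so the same simulation draws are reused at every evaluation and the objective becomes deterministic in θ. A toy sketch:

```julia
using Random, Statistics

# Reseeding a local RNG inside the objective makes it deterministic in θ,
# so the optimizer sees the same simulation shocks at every evaluation.
function simulated_objective(θ; seed = 42, n = 10_000)
    rng    = MersenneTwister(seed)
    shocks = randn(rng, n)
    sim    = θ[1] .+ θ[2] .* shocks     # stand-in for a model simulation
    m      = [mean(sim), var(sim)]      # simulated moments
    return sum(abs2, m .- [0.0, 1.0])   # distance to "data" moments
end
```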

For difficult problems, some people like

@article{chernozhukov2003mcmc,
title={An {MCMC} approach to classical estimation},
author={Chernozhukov, Victor and Hong, Han},
journal={Journal of Econometrics},
volume={115},
number={2},
pages={293--346},
year={2003},
publisher={Elsevier}
}

which is trivial to implement, but my experience with it has been mixed: it is basically Metropolis-Hastings with a quadratic cost function, and it comes with the usual tuning problems. Bayesian optimization, eg

may dominate it, depending on the number of parameters. YMMV.
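For what it's worth, the Chernozhukov-Hong idea really is a few lines: treat exp(-Q(θ)) as an unnormalized quasi-posterior and run random-walk Metropolis-Hastings on it. A toy sketch (the objective and the tuning constants are made up):

```julia
using Random, Statistics

Q(θ) = sum(abs2, θ .- [1.0, 0.5])   # stand-in for the moment distance

# Random-walk Metropolis-Hastings on the quasi-posterior exp(-Q(θ)).
function mh_sample(Q, θ0; iters = 5_000, step = 0.5, rng = MersenneTwister(1))
    θ, q  = copy(θ0), Q(θ0)
    draws = Vector{typeof(θ0)}(undef, iters)
    for i in 1:iters
        θp = θ .+ step .* randn(rng, length(θ))
        qp = Q(θp)
        if log(rand(rng)) < q - qp      # accept with prob min(1, e^{q - qp})
            θ, q = θp, qp
        end
        draws[i] = θ
    end
    return draws
end

draws = mh_sample(Q, [0.0, 0.0])
θ̄ = mean(draws)   # quasi-posterior mean as the point estimate
```

The tuning problems show up in `step`: too small and the chain crawls, too large and almost nothing is accepted.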

As for the parameter constraints: you can always do domain transformations, eg

(disclaimer: my package), and work on \mathbb{R}^n, or just return -Inf when outside the domain. Both become tricky when the good solutions pile up on the edge, though that is usually an indicator of a problem with the model.
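A hand-rolled version of the transformation idea, using a scaled logistic to map each unconstrained coordinate into its interval (the names and the toy objective are illustrative):

```julia
lower = [0.0, -1.0]
upper = [2.0,  1.0]
model_distance(θ) = sum(abs2, θ .- [1.0, 0.5])   # stand-in objective

# Map ℝ → (lo, hi) with a scaled logistic, and its inverse.
to_interval(y, lo, hi)   = lo + (hi - lo) / (1 + exp(-y))
from_interval(x, lo, hi) = log((x - lo) / (hi - x))

# The optimizer searches over unconstrained y; the model only
# ever sees parameters strictly inside the box.
objective_unconstrained(y) = model_distance(to_interval.(y, lower, upper))
```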

I have some econometrics lecture notes that discuss estimating a DSGE model by GMM. It uses simulated annealing, with bounds on the parameters. In the past, I have estimated this model using SMM, using Octave, but the model was simulated using Dynare, and there's no convenient substitute for Dynare for Julia, so far. The code is here: https://github.com/mcreel/Econometrics/tree/master/Examples/DSGE/GMM

A colleague told me that he had positive experience with indirect inference in a difficult model using genetic algorithms, specifically CMA-ES. Julia implementations seem dormant. If anyone has experience with this algorithm, I would be interested in learning the details.

I had some positive experience with CMA-ES in some truss design optimization problem. I was using Matlab at the time, but the algorithm seems to be implemented in https://github.com/wildart/Evolutionary.jl.

CMA-ES works really well - I have used the Haskell version (which uses the Python version under the covers) in two commercial projects. It's a shame there isn't a maintained Julia version.

I got around to adding an example of simulated moments estimation to my econometrics notes. The example estimates a simple discrete time stochastic volatility model using moments from an auxiliary model (i.e., it is indirect inference). The version of GMM is the continuously updating estimator (CUE). The example code is here: https://github.com/mcreel/Econometrics/blob/master/Examples/SBEM/EstimateStochasticVolatilityModel.jl

This code requires a little unregistered package https://github.com/mcreel/SV, which contains the functions that define the moments. Because the simulations are independent of one another, it is natural to use threads to speed them up.
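A minimal threaded pattern for that kind of setup (start Julia with `julia -t auto` or set `JULIA_NUM_THREADS`; the simulation body here is a stand-in):

```julia
using Random, Statistics, Base.Threads

# S independent simulations run in parallel; each gets its own seeded RNG,
# so the result is reproducible and threads don't share random state.
function parallel_moment(θ; S = 8, n = 10_000)
    results = Vector{Float64}(undef, S)
    @threads for s in 1:S
        rng = MersenneTwister(1000 + s)       # per-simulation seed
        sim = θ[1] .+ θ[2] .* randn(rng, n)   # stand-in for a model solve
        results[s] = mean(sim)
    end
    return mean(results)                      # average simulated moment
end
```

Seeding per simulation rather than per thread means the answer does not depend on how the scheduler assigns iterations to threads.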