I’m trying to minimize an objective function to fit an HMM. The optimization works part of the way, but for some starting points it gets stuck, often not far from the initial guess.
If I examine the one-dimensional slice of the objective function along each parameter (fixing the other parameters at the returned “solution” and sweeping the remaining one), it’s clear we haven’t reached a minimum in either direction:
minimum = 181.22345833012542
minimum = 183.6876674990469
If I take an approximate minimizing parameter from each of those sweeps, the objective function drops to 177.41975067999894, so this is a real improvement. And there’s no obvious local minimum in the slices that would explain getting stuck.
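Concretely, the sweep I mean looks roughly like this (the quadratic `f` and the “solution” vector below are toy placeholders, not my actual HMM objective):

```julia
# 1-D slice of f along coordinate i, holding the other coordinates at `sol`.
sweep(f, sol, i, grid) = [f([j == i ? t : sol[j] for j in eachindex(sol)]) for t in grid]

# Toy stand-in for my HMM objective, with a placeholder "solution":
f(x) = (x[1] - 0.5)^2 + 10 * (x[2] - 0.03)^2
sol = [0.478, 0.029]

grid = range(0.0, 1.0; length = 101)
vals = sweep(f, sol, 1, grid)
# If `sol` were a coordinate-wise minimum, grid[argmin(vals)] would sit at
# sol[1] ≈ 0.478; for this toy f it sits at 0.5 instead.
```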
The problem does have some constraints on the parameters; the objective function returns Inf when they are violated. I tried Fminbox, but it was quite a bit slower and seemed to treat the bounds as soft rather than hard?
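One alternative I’ve read about for hard bounds is reparameterizing so the unconstrained optimizer never sees the boundary at all. A sketch of that idea, with placeholder bounds `lo`/`hi` rather than my actual constraints:

```julia
# Map an unconstrained y ∈ ℝ to x ∈ (lo, hi) via a logistic squash.
to_bounded(y, lo, hi) = lo + (hi - lo) / (1 + exp(-y))

# Inverse map, for converting an initial guess into unconstrained space.
to_unbounded(x, lo, hi) = log((x - lo) / (hi - x))

# The optimizer then minimizes g(y) = f(to_bounded.(y, lo, hi)) over all of ℝ,
# so f is never evaluated out of bounds and never has to return Inf.
```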
I’m not sure how to read the trace:
Iter Function value Gradient norm
0 1.873669e+02 4.310469e+02
* Current step size: 1.0
* time: 0.0
* g(x): [-21.477490384417457, -431.04693893543464, 74.67741374598157]
* x: [0.4782608695652174, 0.029177718832891244, 5.0e-5]
1 1.872548e+02 4.245517e+02
* Current step size: 5.884583215091771e-7
* time: 0.023000001907348633
* g(x): [-21.500864521498478, -424.5516786074998, 74.8087284569869]
* x: [0.4782735081731592, 0.029431371991068857, 6.055454452393342e-6]
2 1.872548e+02 4.245517e+02
* Current step size: 4.667441528647649e-19
* time: 0.04400014877319336
* g(x): [-21.500864521498478, -424.5516786074998, 74.8087284569869]
* x: [0.4782735081731592, 0.029431371991068857, 6.0554544523933395e-6]
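Since I only supply `f`, the `g(x)` values in the trace are presumably finite-differenced. To sanity-check gradients like these myself, I’ve been using a quick central-difference helper (the quadratic `f` below is a toy stand-in for my real objective):

```julia
# Central-difference gradient, for sanity-checking the g(x) values in the trace.
function fd_grad(f, x; h = 1e-6)
    g = similar(x)
    for i in eachindex(x)
        e = zeros(length(x)); e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2h)
    end
    g
end

f(x) = (x[1] - 1)^2 + 4x[2]^2
fd_grad(f, [0.0, 1.0])   # ≈ [-2.0, 8.0]
```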
I’m calling it with:
options = Optim.Options(x_tol = 1e-6, f_tol = 1e-6, iterations = 20000, f_calls_limit = 20000, show_trace = true, show_every = 1, extended_trace = true)
Optim.minimizer(optimize(f, starting_values, LBFGS(), options))
Would someone be able to help me understand what’s happening or how I can further debug?
(FWIW, NLopt.jl produces the same solution.)
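For completeness, these are the kinds of variants I’ve been considering, in case the default line search or the finite-difference gradients are the issue. The toy objective stands in for my real one, and I’m assuming the `linesearch` keyword and `autodiff = :forward` from the Optim/LineSearches docs:

```julia
using Optim, LineSearches

# Toy stand-in objective (my real f is the HMM likelihood):
f(x) = (x[1] - 1)^2 + (x[2] + 2)^2

# Backtracking line search instead of the default HagerZhang, which is the
# part that appears to stall in my trace:
algo = LBFGS(linesearch = LineSearches.BackTracking())

# Forward-mode AD gradients instead of finite differences:
res = optimize(f, [0.0, 0.0], algo; autodiff = :forward)
```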