Using Optim.jl but failing to find minimum

I’m trying to fit an HMM by minimizing an objective over its parameters. The optimization works part of the way, but it seems to get stuck at points that are often not far from the initial guess.

If I examine the one-dimensional path of the objective function through each of the parameters (fixing the other parameters at the returned “solution” and sweeping one at a time), we clearly haven’t reached a minimum in either direction:

[plot: 1-D sweep of the objective along the first parameter; minimum = 181.22345833012542]
[plot: 1-D sweep of the objective along the second parameter; minimum = 183.6876674990469]

If I take an approximate minimizing parameter from each of those sweeps, the objective function drops to 177.41975067999894, so that is a real improvement over the returned “solution”. And the point where the optimizer stopped doesn’t look like a local minimum either.
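A minimal sketch of such a sweep (f is the objective, xsol stands for the returned minimizer, and the grid bounds are placeholders):

function sweep(f, xsol, i; lo = 0.0, hi = 1.0, n = 200)
    ts = range(lo, hi; length = n)
    vals = map(ts) do t
        x = copy(xsol)
        x[i] = t          # vary only the i-th parameter
        f(x)
    end
    return ts, vals
end

ts, vals = sweep(f, xsol, 1)   # then plot ts against vals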

The problem does have some constraints on the parameters; the objective function returns Inf when they are violated. I tried Fminbox, but it was quite a bit slower and seemed to treat the boundaries as soft rather than hard?
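For reference, a sketch of the Fminbox-wrapped call (lower and upper are placeholders here, not my actual constraints):

using Optim

lower = [0.0, 0.0, 0.0]    # placeholder bounds
upper = [1.0, 1.0, 1.0]
res = optimize(f, lower, upper, starting_values, Fminbox(LBFGS()))
Optim.minimizer(res)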

I’m not sure how to read the trace:

Iter     Function value   Gradient norm
     0     1.873669e+02     4.310469e+02
 * Current step size: 1.0
 * time: 0.0
 * g(x): [-21.477490384417457, -431.04693893543464, 74.67741374598157]    
 * x: [0.4782608695652174, 0.029177718832891244, 5.0e-5]
     1     1.872548e+02     4.245517e+02
 * Current step size: 5.884583215091771e-7
 * time: 0.023000001907348633
 * g(x): [-21.500864521498478, -424.5516786074998, 74.8087284569869]      
 * x: [0.4782735081731592, 0.029431371991068857, 6.055454452393342e-6]    
     2     1.872548e+02     4.245517e+02
 * Current step size: 4.667441528647649e-19
 * time: 0.04400014877319336
 * g(x): [-21.500864521498478, -424.5516786074998, 74.8087284569869]      
 * x: [0.4782735081731592, 0.029431371991068857, 6.0554544523933395e-6] 

I’m calling it with:

options = Optim.Options(x_tol = 1e-6, f_tol = 1e-6, iterations = 20000, f_calls_limit = 20000, show_trace = true, show_every = 1, extended_trace = true)
Optim.minimizer(optimize(f, starting_values, LBFGS(), options))

Would someone be able to help me understand what’s happening or how I can further debug?

(NLopt.jl produces the same solution FWIW)

Hmm… maybe it’s an algorithm choice issue.

Using Nelder-Mead in NLopt works well. Guess I’ll need to poke a bit more.
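(Optim.jl also has its own NelderMead(); a sketch of the equivalent call, in case it’s useful for comparison:)

using Optim

res = optimize(f, starting_values, NelderMead(), Optim.Options(iterations = 20000))
Optim.minimizer(res)

Being derivative-free, it only ever sees the Inf penalty as a bad function value rather than as an undefined gradient.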

Still, would be nice to know why LBFGS fails here.

Do you have a reproducible example? Is y1 your f function?

Why did it terminate? It seems like the step size got very small.
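If you keep the result object, you can query the termination flags directly (a sketch, reusing your f, starting_values, and options):

using Optim

res = optimize(f, starting_values, LBFGS(), options)
Optim.converged(res)     # overall convergence flag
Optim.x_converged(res)   # stopped because x stopped changing?
Optim.f_converged(res)   # ... because the objective stopped changing?
Optim.g_converged(res)   # ... because the gradient norm fell below g_tol?
Optim.iterations(res)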

Unfortunately it’s a bit complicated to share. Under what conditions does it reduce the step size so much?

I don’t know. It’s hard to say anything without a reproducible example.

“where the objective function produces an Inf if that is violated.”

I overlooked this initially. That suggests your problem is not smooth or differentiable, in which case you’re violating the assumptions of (L-)BFGS.
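As a toy illustration (nothing to do with your model): with an Inf penalty, the finite-difference gradient that Optim computes by default can step across the boundary and come back infinite, after which the line search shrinks the step towards zero, consistent with the tiny step sizes in your trace.

g(x) = x[1] >= 0 ? (x[1] - 1)^2 : Inf    # hard constraint enforced via Inf

h = 1e-8                                  # a typical finite-difference step
x = [1e-9]                                # feasible, but within h of the boundary
fd = (g(x .+ h) - g(x .- h)) / (2h)       # central difference lands on the Inf side
# fd == -Inf: the "gradient" is infinite, so no step length is acceptable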


Two questions:

  • You did a sweep to look for the minimum. Is the minimizer feasible?
  • Have you tried constrained optimization methods instead of penalizing infeasible points?

Answering in order:

  • Yes, Nelder-Mead finds the expected minimum, and it’s generally a smooth function (though with a lot of bumpiness due to numerical issues).
  • Fminbox-wrapped LBFGS was too slow. (See the sketch below for another route.)
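One more route worth trying (a sketch, assuming simple box constraints; the to_box transform and the bounds are illustrative): reparameterize so the constraints can never be violated, and LBFGS sees a smooth unconstrained problem.

using Optim

lower = [0.0, 0.0, 0.0]    # illustrative bounds, not the real constraints
upper = [1.0, 1.0, 1.0]

# Logistic map from unconstrained z onto the box (lower, upper):
to_box(z) = lower .+ (upper .- lower) ./ (1 .+ exp.(-z))

res = optimize(z -> f(to_box(z)), zeros(length(lower)), LBFGS())
xstar = to_box(Optim.minimizer(res))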

I settled on Nelder-Mead, which seems to work well, so I’ll just stick with that. I’ll do some more reading on the rest. Thanks for the assistance!