Is there a Julia equivalent of scipy.optimize.minimize(method='TNC')?
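For anyone landing here: the closest analogue I have found is `tron` from JSOSolvers.jl, which, like TNC, is a Newton-type method that handles bound constraints. A minimal sketch, assuming the current package layout (ADNLPModels.jl for the model; the Rosenbrock objective and bounds are just stand-ins, not my actual problem):

```julia
using ADNLPModels, JSOSolvers

# Bound-constrained Rosenbrock as a stand-in problem.
nlp = ADNLPModel(x -> (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2,
                 [-1.2, 1.0],    # starting point
                 [-2.0, -2.0],   # lower bounds
                 [2.0, 2.0])     # upper bounds

stats = tron(nlp)  # trust-region Newton-type solver for bound constraints
println(stats.solution, "  ", stats.objective)
```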

I have defined objgrad! just like you show here (obj and grad! are also defined; I don’t bother to remove them). Note, though, the difference in outputs between these two options.
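For reference, my objgrad! follows the usual NLPModels.jl pattern of computing the objective and gradient in one pass; roughly like this (the quadratic objective is only illustrative, and the exact NLPModelMeta/AbstractNLPModel signatures vary between NLPModels versions):

```julia
using NLPModels

mutable struct MyModel <: AbstractNLPModel
  meta::NLPModelMeta
  counters::Counters
end

MyModel(x0) = MyModel(NLPModelMeta(length(x0), x0 = x0), Counters())

function NLPModels.obj(nlp::MyModel, x)
  increment!(nlp, :neval_obj)
  return sum(x .^ 2) / 2
end

function NLPModels.grad!(nlp::MyModel, x, g)
  increment!(nlp, :neval_grad)
  g .= x
  return g
end

# The combined pass: solvers that call objgrad! evaluate the model only once.
function NLPModels.objgrad!(nlp::MyModel, x, g)
  increment!(nlp, :neval_obj)
  increment!(nlp, :neval_grad)
  g .= x
  return sum(x .^ 2) / 2, g
end
```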

Sorry, I didn’t notice that. I recreated the issue here: https://github.com/JuliaSmoothOptimizers/NLPModels.jl/issues/304. Wrapping your model in LBFGSModel breaks the use of objgrad. This should be easy to fix. Thanks for finding it!
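For anyone following along, a minimal sketch of the symptom, assuming a model like the MyModel above (in current versions LBFGSModel lives in NLPModelsModifiers.jl):

```julia
using NLPModels

nlp = MyModel(zeros(2))      # any model that defines objgrad!
lbfgs_nlp = LBFGSModel(nlp)  # quasi-Newton wrapper used with tron/trunk

f, g = objgrad(lbfgs_nlp, lbfgs_nlp.meta.x0)

# Before the fix, the wrapper fell back to separate obj and grad calls,
# so the inner model's counters showed twice the expected evaluations:
neval_obj(nlp), neval_grad(nlp)
```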

NLPModels 0.13.2 is released and should help with this. Let me know if it works now.

I have tested the new version, NLPModels v0.13.2, and it indeed halves the number of evaluations!

To summarize: with the same gradient tolerance (g_tol and outer_g_tol = 1e-4 for Optim.jl, atol = 1e-4 for JSOSolvers), optimizing my problem takes 12 evaluations with JSOSolvers versus 24 with Optim.jl.
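For completeness, the two setups look roughly like this; the objective, starting point, and bounds below are stand-ins for my actual problem:

```julia
using ADNLPModels, JSOSolvers, NLPModels, NLPModelsModifiers, Optim

f(x) = sum(x .^ 2) / 2    # stand-in objective
g!(g, x) = (g .= x; g)
x0 = [2.0, -1.5]

# JSOSolvers: tron on an LBFGS-wrapped model; atol bounds the gradient norm.
stats = tron(LBFGSModel(ADNLPModel(f, x0)), atol = 1e-4)

# Optim.jl: Fminbox exposes inner (g_tol) and outer (outer_g_tol) tolerances.
lower, upper = fill(-4.0, 2), fill(4.0, 2)
res = optimize(f, g!, lower, upper, x0, Fminbox(LBFGS()),
               Optim.Options(g_tol = 1e-4, outer_g_tol = 1e-4))
```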
