Optim.jl not honoring convergence criteria

I am trying to add early stopping to my code using Optim.jl, but the convergence options I set are not being respected, and I was hoping someone might know why. Here is a small example of the cost function I am using.

function toy(X)
    x, y = X
    cost = (x - 3)^2 + (y - 5)^2
    @info cost
    return cost
end
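
The call is a box-constrained optimize with a tolerance set through Optim.Options; a simplified sketch with placeholder bounds, start point, and tolerance (not the exact values from the real code):

using Optim

lower = [0.0, 0.0]
upper = [Inf, Inf]
x0    = [10.0, 50.0]

# Expecting f_abstol to stop the run once improvements in the cost are this small
res = optimize(toy, lower, upper, x0, Fminbox(),
               Optim.Options(f_abstol = 1.0))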

I’ve run into this before as well. Basically, I guess these constraints are treated more like fuzzy penalties than firm boundaries.

Oh thanks, Adienes. Did you ever figure out a way to force the constraints to work? Or did you have to use another package that respects the constraint specification?

Have you considered modelling your problem using JuMP, which includes support for nonlinear models?
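
For example, a rough sketch of the toy problem above in JuMP (assuming Ipopt as the solver; "tol" below is Ipopt's own convergence tolerance, which you can loosen to stop earlier):

using JuMP, Ipopt

model = Model(Ipopt.Optimizer)
set_optimizer_attribute(model, "tol", 1e-2)  # loosen Ipopt's convergence tolerance

@variable(model, x >= 0)   # box constraints as in the Optim example
@variable(model, y >= 0)
@objective(model, Min, (x - 3)^2 + (y - 5)^2)

optimize!(model)
value(x), value(y)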


Hi @Gbenga_Fabusola, welcome to the forum.

There are actually a few different options.

I think you’re actually after outer_f_reltol:

Optim.Options(; outer_f_reltol = 1000)
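
In context, an untested sketch reusing the toy cost function and box constraints from your example:

res = optimize(toy, [0.0, 0.0], [Inf, Inf], [10.0, 50.0], Fminbox(),
               Optim.Options(; outer_f_reltol = 1000, show_trace = true))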

You can see the options by looking at the help (I don’t know if this is well documented online):

help?> Optim.Options
  Configurable options with defaults (values 0 and NaN indicate unlimited):

  x_abstol::Real = 0.0,
  x_reltol::Real = 0.0,
  f_abstol::Real = 0.0,
  f_reltol::Real = 0.0,
  g_abstol::Real = 1e-8,
  g_reltol::Real = 1e-8,
  outer_x_abstol::Real = 0.0,
  outer_x_reltol::Real = 0.0,
  outer_f_abstol::Real = 0.0,
  outer_f_reltol::Real = 0.0,
  outer_g_abstol::Real = 1e-8,
  outer_g_reltol::Real = 1e-8,
  f_calls_limit::Int = 0,
  g_calls_limit::Int = 0,
  h_calls_limit::Int = 0,
  allow_f_increases::Bool = true,
  allow_outer_f_increases::Bool = true,
  successive_f_tol::Int = 1,
  iterations::Int = 1_000,
  outer_iterations::Int = 1000,
  store_trace::Bool = false,
  show_trace::Bool = false,
  extended_trace::Bool = false,
  show_every::Int = 1,
  callback = nothing,
  time_limit = NaN

  See http://julianlsolvers.github.io/Optim.jl/stable/#user/config/

Hi, unfortunately outer_f_reltol did not have the desired effect. Only g_abstol worked to some degree, but I want to be able to stop the algorithm based on the value of the loss at each iteration. I am still trying to figure out how to solve this.

One issue is that Optim’s algorithm calls your function several times per iteration, but checks convergence only once per iteration, so what you see when you print from your cost function isn’t exactly what Optim looks at for convergence.

If you hack into Optim like this, you can see the function value is already down to around 6 the first time convergence is assessed:

using Optim

function toy(X)
    x, y = X
    cost = (x - 3) ^ 2 + (y - 5) ^ 2
    @info "calling function : $cost"

    return cost
end

function custom_stopping(x)
    @info "check for custom stopping"
    return false
end

# Override Optim's internal assess_convergence just to log when it is called
@eval Optim begin
    function assess_convergence(x, x_previous, f_x, f_x_previous, gx, x_abstol, x_reltol, f_abstol, f_reltol, g_abstol)  
        @info "assessing convergence"
        f_x < 1000  && return true, false, false, false   # report convergence once f_x drops below 1000
        return false, false, false, false
    end
end

optimize(
    toy, [0,0], [Inf,Inf], [10., 50.],  Optim.Fminbox(),
    Optim.Options(callback = custom_stopping)
)
[ Info: calling function : 2074.00084776729
[ Info: calling function : 2073.9991522400437
[ Info: calling function : 2074.027249636707
[ Info: calling function : 2073.9727505466353
[ Info: calling function : 2074.0
[ Info: check for custom stopping
[ Info: calling function : 1373.0575397465902
[ Info: calling function : 1373.0561782060774
[ Info: calling function : 1373.0752432895974
[ Info: calling function : 1373.0384747834933
[ Info: calling function : 1373.0568589732666
[ Info: calling function : 14.784521316821616
[ Info: calling function : 14.784142416634552
[ Info: calling function : 14.784584618784134
[ Info: calling function : 14.784079116620969
[ Info: calling function : 14.784331865524148
[ Info: calling function : 11.474314941709697
[ Info: calling function : 11.47403851522763
[ Info: calling function : 11.474101109876269
[ Info: calling function : 11.474252345510765
[ Info: calling function : 11.474176727478573
[ Info: calling function : 5.990978973840084
[ Info: calling function : 5.990656542939395
[ Info: calling function : 5.990810840734959
[ Info: calling function : 5.990824675619014
[ Info: calling function : 5.990817757302663
[ Info: assessing convergence
[ Info: check for custom stopping
[ Info: assessing convergence
[ Info: calling function : 5.990817757302663
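
As an aside, instead of patching assess_convergence you can let the callback itself request a stop: it receives an Optim.OptimizationState whose value field is the traced objective, and returning true stops the run. A sketch (note that under Fminbox the traced value includes the barrier term, and I haven't checked on every Optim version whether an early callback stop also ends the outer loop):

function stop_when_small(os)
    # os.value is the objective value Optim traced at this iteration
    # (under Fminbox it includes the barrier term, so treat the threshold as approximate)
    return os.value < 1000   # returning true asks Optim to stop
end

optimize(
    toy, [0, 0], [Inf, Inf], [10., 50.], Optim.Fminbox(),
    Optim.Options(callback = stop_when_small)
)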