Optim.jl --- Do all Methods Allow Box Constraints? Should all Work Without Them?

  • I have been experimenting with Optim.jl. Do all optimizers offer box constraints?

  • NOTE: All optimizers I tried can work without box constraints, except the brand new SAMIN.

This example failed to use them:

julia> using Optim

julia> myfun(x)= (x[1]-2)^2 + (x[2]-3)^2 + ((x[1]-2)*(x[2]-3))^2;

julia> optimize(myfun, [ 0.0, 0.0 ], fill(6.0, 2), fill(100.0, 2), Optim.Options(iterations=20000))

See Fminbox.

http://julianlsolvers.github.io/Optim.jl/latest/user/minimization/
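
For reference, a minimal sketch of what a box-constrained call looks like. This assumes a recent Optim release where Fminbox wraps an inner-optimizer instance and the bounds come before the starting point; older releases used Fminbox{LBFGS}() and a different argument order, as the posts below show.

using Optim

myfun(x) = (x[1]-2)^2 + (x[2]-3)^2 + ((x[1]-2)*(x[2]-3))^2

lower = fill(6.0, 2)
upper = fill(100.0, 2)
x0    = fill(50.0, 2)   # the starting point must lie inside the box

# Fminbox wraps an unconstrained inner optimizer; with no gradient
# supplied, derivatives fall back to finite differences.
res = optimize(myfun, lower, upper, x0, Fminbox(LBFGS()))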

1 Like

thx, chris. I thought I had followed it, but did not look closely enough. I wonder whether this is a sort of usage buglet, too. See

julia> optimize(myfun, [ 0.0, 0.0 ], fill(6.0, 2), fill(100.0, 2), Optim.Options(iterations=20000))
Results of Optimization Algorithm
 * Algorithm: Nelder-Mead
 * Starting Point: [100.0,100.0]
 * Minimizer: [1.9999890189817244,3.0000111941805336]
 * Minimum: 2.458924e-10
 * Iterations: 58
 * Convergence: true
   *  √(Σ(yᵢ-ȳ)²)/n < 1.0e-08: true
   * Reached Maximum Number of Iterations: false
 * Objective Calls: 113

The above happily seems to work but ignores the box constraints. Instead, it should do the same thing as the following.

Now, if I omit the Optim.Options, I get the correct error message:

julia> optimize(myfun, [ 0.0, 0.0 ], fill(6.0, 2), fill(100.0, 2))
ERROR: Initial position must be inside the box

Moreover, with the Options, it can fail altogether. Are Optim.Options incompatible with Fminbox?

julia> optimize(myfun, [ 50.0, 50.0 ], fill(6.0, 2), fill(100.0, 2), Fminbox{NelderMead}(), Optim.Options(iterations=200000))
ERROR: No default objective type for Optim.Fminbox{Optim.NelderMead}() and (myfun, [50.0, 50.0], [6.0, 6.0]).

I asked a similar question before.

Fminbox behaves a bit differently than the unconstrained Optim algorithms. In the future we will hopefully manage to merge these approaches in a more intuitive way. I would recommend that you use it with the first-order optimization algorithms, rather than NelderMead etc.

Please read all the instructions under “Box minimization” in the docs carefully.

Regarding the passing of options: in Fminbox, the options to the outer optimization algorithm (which handles the box constraints) are specified with keyword arguments; see the source code. Options to the inner optimizer, such as GradientDescent or LBFGS, are passed via the keyword argument optimizer_o.

To use box constraints with LBFGS, you can do the following. Note that this will calculate derivatives using finite differences. It tells Fminbox to run 10 outer iterations, and LBFGS to run 2 iterations each time it is called.

f(x) = (x[1]-2)^2 + (x[2]-3)^2 + ((x[1]-2)*(x[2]-3))^2
x0 = fill(50.0, 2)
res = optimize(f, x0, fill(6.0, 2), fill(100.0, 2), Fminbox{LBFGS}();
               iterations = 10, show_trace = true,
               optimizer_o = Optim.Options(show_trace = true, iterations = 2))
1 Like

To clean up this interface a bit, it might make sense to make this a dedicated function fminbox(obj, x, lower, upper, inner_method, inner_options, possibly_fminbox_options) or similar…
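
A purely hypothetical sketch of what such a wrapper might look like on top of the current interface. Note that fminbox below is not an existing Optim function, and the call it forwards to assumes the newer bounds-first convention:

# Hypothetical convenience wrapper -- not part of Optim.
function fminbox(obj, x, lower, upper, inner_method = LBFGS();
                 options = Optim.Options())
    # Forward to the existing Fminbox path; in the newer interface
    # the bounds come before the starting point.
    optimize(obj, lower, upper, x, Fminbox(inner_method), options)
end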

2 Likes

Piping in from the end-user perspective, I think it could also be

Options.optim( min=…, max=… )

with intelligence coded into optimize to handle this appropriately.

/iaw

1 Like

I have a plan, stay tuned 🙂

2 Likes

You can take part in the discussion at https://github.com/JuliaNLSolvers/Optim.jl/pull/584 where I’ve moved some things around (still WIP, but you can hopefully see what I’m aiming for).

2 Likes

What is the correct syntax now? Everything that I try keeps returning weird error messages, and the doc isn’t helping.

What did you try? Doesn’t the doc example work?

1 Like

The problem (maybe?) comes from the fact that I have a function with constant parameters, so I have to give an input like this:

result = optimize(params -> cost(params,x,y,folds),start, LBFGS())

This works fine (although suuuuper slow), and I thought that giving constraints might speed the process up, since I know that the parameters I am trying to optimize are both positive.

I have tried doing it like in the docs:

result = optimize(params -> cost(params,x,y,folds), fill(0,2), fill(Inf,2), start, Fminbox(LBFGS()))

I also tried without Fminbox, and I have tried the order you discussed in this thread:

result = optimize(params -> cost(params,x,y,folds),start,fill(0,2),fill(Inf,2), Fminbox(LBFGS()))

and I also tried the {} around Fminbox; nothing has worked so far.

Okay… Probably it’s the fill(0, 2) that should be fill(0.0, 2), but without seeing the error it’s impossible for me to help, because I don’t have cost either 🙂
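
For reference, the difference between the two is just the element type, but it matters for dispatch:

fill(0, 2)    # 2-element Array{Int64,1}: integer bounds
fill(0.0, 2)  # 2-element Array{Float64,1}: matches a Float64 starting point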

1 Like

Alright, I modified it to fill(0.0,2) and still get the same answer from Julia. The following syntax

result = optimize(params -> cost(params,x,y,folds),start,fill(0.0,2),fill(1000.0,2),  Fminbox{LBFGS}())

returns

LoadError: MethodError: no method matching Fminbox{LBFGS,T,P} where P where T()
in expression starting at C:\Users\cnelias\Downloads\ChangePointDetection3.jl:71
#lsdd3#554(::Int64, ::Function, ::Array{Float64,1}, ::Array{Float64,1}) at ChangePointDetection3.jl:57
lsdd3(::Array{Float64,1}, ::Array{Float64,1}) at ChangePointDetection3.jl:54
top-level scope at util.jl:156

Trying it like in the docs, i.e., without Fminbox, returns:

 LoadError: MethodError: objects of type Array{Float64,1} are not callable
Use square brackets [] for indexing an Array.
in expression starting at C:\Users\cnelias\Downloads\ChangePointDetection3.jl:71
(::getfield(NLSolversBase, Symbol("#fg!#8")){getfield(Main, Symbol("##586#587")){Int64,Array{Float64,1},Array{Float64,1}},Array{Float64,1}})(::Array{Float64,1}, ::Array{Float64,1}) at abstract.jl:13
value_gradient!!(::OnceDifferentiable{Float64,Array{Float64,1},Array{Float64,1}}, ::Array{Float64,1}) at interface.jl:82
initial_state(::LBFGS{Nothing,LineSearches.InitialStatic{Float64},LineSearches.HagerZhang{Float64,Base.RefValue{Bool}},getfield(Optim, Symbol("##22#24"))}, ::Optim.Options{Float64,Nothing}, ::OnceDifferentiable{Float64,Array{Float64,1},Array{Float64,1}}, ::Array{Float64,1}) at l_bfgs.jl:158
optimize(::OnceDifferentiable{Float64,Array{Float64,1},Array{Float64,1}}, ::Array{Float64,1}, ::LBFGS{Nothing,LineSearches.InitialStatic{Float64},LineSearches.HagerZhang{Float64,Base.RefValue{Bool}},getfield(Optim, Symbol("##22#24"))}, ::Optim.Options{Float64,Nothing}) at optimize.jl:33
#optimize#89 at interface.jl:130 [inlined]
optimize(::Function, ::Array{Float64,1}, ::Array{Float64,1}, ::Array{Float64,1}, ::LBFGS{Nothing,LineSearches.InitialStatic{Float64},LineSearches.HagerZhang{Float64,Base.RefValue{Bool}},getfield(Optim, Symbol("##22#24"))}, ::Optim.Options{Float64,Nothing}) at interface.jl:128
optimize(::Function, ::Array{Float64,1}, ::Array{Float64,1}, ::Array{Float64,1}, ::LBFGS{Nothing,LineSearches.InitialStatic{Float64},LineSearches.HagerZhang{Float64,Base.RefValue{Bool}},getfield(Optim, Symbol("##22#24"))}) at interface.jl:128
#lsdd3#585(::Int64, ::Function, ::Array{Float64,1}, ::Array{Float64,1}) at ChangePointDetection3.jl:57
lsdd3(::Array{Float64,1}, ::Array{Float64,1}) at ChangePointDetection3.jl:54
top-level scope at util.jl:156

The cost function is kinda complicated, should I paste it here? Overall I think it would be best for me to directly provide the gradient, but I don’t think I can find an analytical expression for it…

You are using very old syntax… Where are you getting your examples from? The bounds come first, then the initial guess (start should be the fourth input), and it’s Fminbox(LBFGS()), not Fminbox{LBFGS}().
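
Putting that together for the example above, the corrected call would be something like this (with cost, x, y, folds, and start as defined earlier in the thread):

result = optimize(params -> cost(params, x, y, folds),
                  fill(0.0, 2), fill(1000.0, 2), start,
                  Fminbox(LBFGS()))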

2 Likes

Thanks, it worked. It’s still super slow though; I think the only way around that is to provide the gradients, but I don’t know if I can manage to compute them.
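
One way to avoid deriving gradients by hand is forward-mode automatic differentiation. A sketch, assuming the cost function is generic enough to accept ForwardDiff's dual numbers (OnceDifferentiable comes from NLSolversBase and is re-exported by Optim):

# Build a differentiable objective whose gradient comes from
# ForwardDiff rather than finite differences.
od = OnceDifferentiable(params -> cost(params, x, y, folds), start;
                        autodiff = :forward)
result = optimize(od, fill(0.0, 2), fill(1000.0, 2), start, Fminbox(LBFGS()))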

Did you profile it? Where is the time spent?

1 Like

About 2/3 of the time is spent in linesearch-related functions.

Try Fminbox(LBFGS(linesearch = LineSearches.BackTracking())), and do using LineSearches first, obviously.
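
Spelled out as a full call, that suggestion would look something like this (same cost and start as above):

using Optim, LineSearches

# BackTracking is usually much cheaper per iteration than the default
# HagerZhang linesearch, at the cost of less accurate step lengths.
inner = LBFGS(linesearch = LineSearches.BackTracking())
result = optimize(params -> cost(params, x, y, folds),
                  fill(0.0, 2), fill(1000.0, 2), start,
                  Fminbox(inner))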

2 Likes

It’s better this way (3 times faster), thanks! But it is still too slow for the application I have in mind. A rough estimation of the parameters with nested for-loops is still more efficient. Most of the time seems to be spent in optimization-related functions (BackTracking, manifold, interface).
Also, I get the following warning:

┌ Warning: Linesearch failed, using alpha = 0.0 and exiting optimization.
│ The linesearch exited with message:
│ Linesearch failed to converge, reached maximum iterations 1000.
└ @ Optim C:\Users\cnelias\.julia\packages\Optim\Agd3B\src\utilities\perform_linesearch.jl:47