 # Stop optimization problem when the objective is negative?

#1

I have a Helmholtz function that needs to be optimized, but I need to stop the optimization algorithm if the value of the objective function is negative. Any ideas? I'm using Optim.jl.

```julia
using Optim, LineSearches

N = 10
c = rand(N) .- 0.5
x0 = rand(N)
f(x) = sum(c .* x .^ collect(1:length(x))) # an nth-degree polynomial as an example
# my objective function is not a polynomial
lb = zeros(N)
ub = ones(N)
# this is copied straight from the Optim.jl guide
inner_optimizer = GradientDescent(linesearch = LineSearches.BackTracking(order = 3))
sol = optimize(f, lb, ub, x0, Fminbox(inner_optimizer))
```
#2

With NLopt.jl you can specify a lower bound for the objective function (`stopval`) as a stopping condition.
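For reference, a minimal sketch of that stopping condition, reusing the toy polynomial from the first post and assuming the property-setter API of recent NLopt.jl versions; the algorithm choice (`:LN_COBYLA`) is just an illustrative placeholder:

```julia
using NLopt

N = 10
c = rand(N) .- 0.5
f(x) = sum(c .* x .^ collect(1:N))

opt = Opt(:LN_COBYLA, N)               # derivative-free local optimizer, as an example
opt.lower_bounds = zeros(N)
opt.upper_bounds = ones(N)
opt.stopval = 0.0                      # halt as soon as f(x) <= 0
opt.min_objective = (x, grad) -> f(x)  # grad unused for a derivative-free method

minf, minx, ret = optimize(opt, rand(N))
# ret == :STOPVAL_REACHED when the bound triggered the stop
```

If the run terminates with `:STOPVAL_REACHED`, `minx` is a point where the objective is already non-positive, which is exactly the sign information you need.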

#3

That would suffice, thanks!

#5

You can minimize `max(0, f(x))` or `max(0, f(x))^2` if you want to make it differentiable.
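A sketch of that clamping trick, again assuming the toy objective from the first post: wherever `f(x) <= 0` the clamped objective `max(0, f(x))^2` is flat at zero, so a minimizer that reaches zero has found a point with a non-positive `f`.

```julia
using Optim

N = 10
c = rand(N) .- 0.5
f(x) = sum(c .* x .^ collect(1:N))
g(x) = max(0.0, f(x))^2   # zero (and differentiable) wherever f(x) <= 0

sol = optimize(g, zeros(N), ones(N), rand(N), Fminbox(GradientDescent()))
# if Optim.minimum(sol) == 0, then f(Optim.minimizer(sol)) <= 0
```

Note that Optim itself still runs to its usual convergence criteria here; the clamping only makes "f went negative" the global minimum rather than an early-exit condition.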

#6

This is more of a theoretical question, but I'm implementing a tunneling algorithm for pseudo-global minimization, and I could instead use some of the algorithms already written in NLopt. The idea is to find a negative value as soon as possible (the reason for the stop condition). Instead of using a tunneling algorithm (finding a local minimum, then using a tunneling function to carve away from that minimum and keep searching), I could use a global optimization method that achieves the same result. My functions are n-differentiable, and precision is not required at this step, just the sign. Which of the algorithms available in NLopt would you recommend in this case?