Hello, I am using Optim.jl to solve a constrained optimization problem.
The gradient is not specified, so finite differences are the default.
This works nicely for the objective, but not for the constraints.
Here is my MWE (thank you in advance for any help):
using Optim
function fun(x)  # objective: 3-D Rosenbrock-type function
    (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2 + (x[3] - x[1])^2
end
function con_c!(c, x)  # constraint values
    c[1] = x[1]^2 + x[2]^2          # 1st constraint
    c[2] = x[2] * sin(x[1]) - x[1]  # 2nd constraint
    c
end
function con_Jac!(J, x)  # Jacobian of the constraints (2×3; neither constraint depends on x[3])
    J[1,1] = 2 * x[1];             J[1,2] = 2 * x[2];   J[1,3] = 0.0
    J[2,1] = x[2] * cos(x[1]) - 1; J[2,2] = sin(x[1]);  J[2,3] = 0.0
    J
end
function con_Hess!(h, x, λ)  # constraint contribution to the Hessian of the Lagrangian
    # 1st constraint x1^2 + x2^2: Hessian is 2I in (x1, x2)
    h[1,1] += λ[1] * 2
    h[2,2] += λ[1] * 2
    # 2nd constraint x2*sin(x1) - x1: ∂²/∂x1² = -x2*sin(x1), ∂²/∂x1∂x2 = cos(x1)
    h[1,1] += -λ[2] * x[2] * sin(x[1])
    h[1,2] += λ[2] * cos(x[1])
    h[2,1] += λ[2] * cos(x[1])
    h
end
x_initial = [0.3, 0.2, 0.1]
df = TwiceDifferentiable(fun, x_initial)  # here, the derivatives are generated automatically (finite differences by default)
lx = [0.0, 0.0, 0.0]; ux = [1.0, 1.0, 1.0]  # box constraints on x
lc = [-Inf, -3.0]; uc = [0.5^2, 3.0]        # lower/upper bounds on c(x)
# this works:
dfc = TwiceDifferentiableConstraints(con_c!, con_Jac!, con_Hess!, lx, ux, lc, uc)
# this does not work, but it should (MethodError: no method matching):
#dfc = TwiceDifferentiableConstraints(con_c!, lx, ux, lc, uc)
@show res = optimize(df, dfc, x_initial, IPNewton())
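
For completeness, here is a workaround sketch that generates the constraint derivatives with ForwardDiff.jl instead of writing them by hand, so only con_c! is needed. This is just my own sketch, not an official Optim.jl shortcut; con_Jac_ad!, con_Hess_ad!, dfc_ad, and res_ad are made-up names, and it reuses con_c!, df, lx, ux, lc, uc, and x_initial from the MWE above:

using ForwardDiff

# Jacobian of the constraints via forward-mode AD; zeros(2) is a value buffer for con_c!
con_Jac_ad!(J, x) = ForwardDiff.jacobian!(J, con_c!, zeros(2), x)

# constraint contribution to the Hessian of the Lagrangian via AD
function con_Hess_ad!(h, x, λ)
    for i in 1:2
        # scalar view of the i-th constraint; the buffer eltype follows y so dual numbers pass through
        ci = y -> con_c!(zeros(eltype(y), 2), y)[i]
        h .+= λ[i] .* ForwardDiff.hessian(ci, x)
    end
    h
end

dfc_ad = TwiceDifferentiableConstraints(con_c!, con_Jac_ad!, con_Hess_ad!, lx, ux, lc, uc)
res_ad = optimize(df, dfc_ad, x_initial, IPNewton())

Still, it would be nicer if TwiceDifferentiableConstraints(con_c!, lx, ux, lc, uc) simply fell back to finite differences, as TwiceDifferentiable does for the objective.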