Hi, I am trying to solve a high-dimensional nonlinear optimization problem. I have a function that maps a vector to a value, and its Jacobian. I want to solve the problem providing the gradient, but I cannot find good resources that explain how to do so using the Optim package. Can anyone point me to resources or provide a very simple example illustrating how to set up the problem?

Miguel.
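
For completeness, Optim.jl itself accepts an analytic gradient directly, supplied as a mutating function alongside the objective. A minimal sketch, using the 2-D Rosenbrock function as a stand-in for the real objective (names and starting point are illustrative, not from this thread):

```julia
using Optim

# Objective: 2-D Rosenbrock, standing in for the real 15-dimensional function.
f(x) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2

# Analytic gradient in mutating form: fill G in place.
function g!(G, x)
    G[1] = -2 * (1 - x[1]) - 400 * x[1] * (x[2] - x[1]^2)
    G[2] = 200 * (x[2] - x[1]^2)
    return
end

x0 = zeros(2)
res = Optim.optimize(f, g!, x0, LBFGS())
Optim.minimizer(res)  # converges toward [1.0, 1.0]
```

When the gradient function is passed this way, Optim uses it instead of finite differencing.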

Do you have only a single function to optimize? Are there other constraints? Integrality or bounds on the variables?

You can register a user-defined function in JuMP and provide the analytic gradient: Nonlinear Modeling · JuMP


Thanks!

I am trying to adapt the example in the documentation so that it takes in a vector instead of its elements explicitly, because I am solving a 15-dimensional problem. However, when I do this, JuMP assumes it is a univariate function. I illustrate this below:

When I call the function I run into this error:

Any ideas on how to fix this?

Thanks a lot!

I think you should use the splatted form, `f(x...)`.

There is an example of this in Nonlinear Modeling · JuMP.

```julia
using JuMP

f(x...) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2
function ∇f(g, x...)
    g[1] = 400 * x[1]^3 - 400 * x[1] * x[2] + 2 * x[1] - 2
    g[2] = 200 * (x[2] - x[1]^2)
    return
end
function ∇²f(H, x...)
    H[1, 1] = 1200 * x[1]^2 - 400 * x[2] + 2
    # H[1, 2] = -400 * x[1]  <-- Not needed. Fill the lower-triangular only.
    H[2, 1] = -400 * x[1]
    H[2, 2] = 200.0
    return
end

model = Model()
register(model, :rosenbrock, 2, f, ∇f, ∇²f)
@variable(model, x[1:2])
@NLobjective(model, Min, rosenbrock(x[1], x[2]))
```

Oops, I never replied to this.

@Adedayo_Yusuff is correct. You need to use the splatted syntax. You cannot pass a `Vector` as a single argument.
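
If the underlying function is already written to take a `Vector`, one way to adapt it is a thin splatted wrapper that collects the arguments back into a vector. A sketch for the 15-dimensional case (`my_f` and `my_∇f!` are illustrative placeholders for the real objective and gradient):

```julia
using JuMP

n = 15
my_f(x::Vector) = sum(x .^ 2)           # placeholder for the real objective
my_∇f!(g, x::Vector) = (g .= 2 .* x)    # placeholder for the real gradient

# Splatted wrappers: JuMP calls these with n scalar arguments.
f(x...) = my_f(collect(x))
∇f(g, x...) = my_∇f!(g, collect(x))

model = Model()
register(model, :obj, n, f, ∇f)
@variable(model, x[1:n])
@NLobjective(model, Min, obj(x...))     # splatting is also allowed at the call site
```

Note that `collect` allocates on every call, so for performance-critical problems you may want to work with the tuple directly.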