I’m trying to use ReverseDiffSparse to return the gradient and Hessian of a scalar-valued function f. I only need to evaluate the gradient and Hessian (i.e., I don’t need to set up a whole model). I saw from the JuMP documentation that I can access automatic differentiation through JuMP using MathProgBase, per the following example:
using JuMP, MathProgBase

m = Model()
@variable(m, x)
@variable(m, y)
@NLobjective(m, Min, sin(x) + sin(y))
values = zeros(2)
values[linearindex(x)] = 2.0
values[linearindex(y)] = 3.0
d = JuMP.NLPEvaluator(m)
MathProgBase.initialize(d, [:Grad])
objval = MathProgBase.eval_f(d, values) # == sin(2.0) + sin(3.0)
∇f = zeros(2)
MathProgBase.eval_grad_f(d, ∇f, values)
# ∇f[linearindex(x)] == cos(2.0)
# ∇f[linearindex(y)] == cos(3.0)
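For the Hessian, I’m guessing it’s something along these lines, based on the MathProgBase nonlinear interface (untested sketch; as I understand it the Hessian of the Lagrangian is returned as sparse lower-triangular entries):

MathProgBase.initialize(d, [:Grad, :Hess])          # request Hessian support up front
rows, cols = MathProgBase.hesslag_structure(d)      # indices of the nonzero lower-triangular entries
H = zeros(length(rows))
# σ = 1.0 weights the objective; μ is empty because there are no constraints
MathProgBase.eval_hesslag(d, H, values, 1.0, Float64[])
# H[k] is the Hessian entry at (rows[k], cols[k])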
Three questions:

1. Is this using ReverseDiffSparse?
2. Is there a better way of returning the gradient and Hessian than setting up a model and using MathProgBase.eval_grad_f?
3. (Not necessary if (2) is solved.) If I define my function f(x,y) as returning sin(x) + cos(y), how can I insert it into @NLobjective(m, Min, sin(x) + sin(y))? I tried @NLobjective(m, Min, f(x,y)), but I get the following error:
Unrecognized function "f" used in nonlinear expression.
Stacktrace:
[1] error(::String) at ./error.jl:21
[2] include_string(::String, ::String) at ./loading.jl:515
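I suspect I may need to register f as a user-defined nonlinear function first. Is something like the sketch below the right approach? (Based on my reading of the JuMP docs on user-defined functions; untested, so the exact call may be off.)

f(x, y) = sin(x) + cos(y)
JuMP.register(m, :f, 2, f, autodiff=true)   # 2 = number of arguments; autodiff via ForwardDiff
@NLobjective(m, Min, f(x, y))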
Thanks!