When minimizing a costly function f(x) with Optim.jl and supplying the gradient ∇f(x), one can use Optim.only_fg! to compute the function value and the gradient within the same function, avoiding repeated computations, like this:
function fg!(F, G, x)
    # do common computations here
    # ...
    if G !== nothing
        # code to compute gradient here
        # writing the result to the vector G
        # G .= ...
    end
    if F !== nothing
        # value = ... code to compute objective function
        return value
    end
end
Optim.optimize(Optim.only_fg!(fg!), [0., 0.], Optim.BFGS())
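For concreteness, here is a minimal sketch (my own illustration, not from the Optim.jl docs) for f(x) = ‖Ax − b‖², where the residual r = Ax − b is the shared computation needed by both the value and the gradient ∇f(x) = 2Aᵀr:

using Optim, LinearAlgebra

A = [1.0 2.0; 3.0 4.0; 5.0 6.0]
b = [1.0, 2.0, 3.0]

function fg!(F, G, x)
    r = A * x - b            # shared costly computation: the residual
    if G !== nothing
        G .= 2 .* (A' * r)   # gradient ∇f(x) = 2Aᵀr
    end
    if F !== nothing
        return dot(r, r)     # objective value ‖Ax − b‖²
    end
end

Optim.optimize(Optim.only_fg!(fg!), [0.0, 0.0], Optim.BFGS())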
See the Optim.jl documentation for more info.
Now, when solving a system of nonlinear equations f(x) = 0 with NonlinearSolve.jl, if we want to supply the Jacobian, we have to set up the problem as:
function f(u, p)
    # computations here
    return fval
end

function df(u, p)
    # computations here
    return J   # the Jacobian matrix
end
fn = NonlinearFunction(f, jac = df)
prob = NonlinearProblem(fn, u0, p)
sol = solve(prob, NewtonRaphson(; concrete_jac = true))
Now, there are settings (just like in the Optim.jl case) where f and df share common costly computations, so evaluating the function and the Jacobian independently duplicates that work, as in the sketch below. Is there an equivalent of Optim.only_fg! that applies to NonlinearSolve.jl's NonlinearFunction?
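For concreteness, here is a minimal sketch of such a setting (my own illustration; expensive_kernel is a hypothetical stand-in for the shared costly computation), where both f and df must evaluate the same intermediate:

using LinearAlgebra

# Hypothetical stand-in for a costly intermediate needed by both f and df
expensive_kernel(u, p) = exp.(p .* u)

function f(u, p)
    w = expensive_kernel(u, p)   # computed here...
    return u .* w .- 1.0
end

function df(u, p)
    w = expensive_kernel(u, p)   # ...and recomputed here, duplicating the work
    return Diagonal(w .* (1 .+ p .* u))   # Jacobian of f_i = u_i*w_i - 1
end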
Thanks in advance!