I am trying to optimize an objective function that itself contains auto-differentiation, but the optimization fails. I reduced it to a trivial example and it still fails. If anyone can suggest an approach, that would be great.
Thanks!
See the trivial example below:
using Optim
using ForwardDiff
## Objective Function
function obj(x)
    return 2 * (x[1] - 1)^2 + 4 * (x[1] - 1) # this works
    ## return 2.0 * (x[1] - 1)^2 + ForwardDiff.derivative(x -> 2.0 * (x[1] - 1)^2, x) ## fails
end
opt_options = Optim.Options(
    g_abstol=1e-4,
    f_abstol=1e-3,
    iterations=1000,
    f_calls_limit=10000,
    store_trace=true,
    show_trace=true,
    show_every=10,
)
lower_bounds = [-10.0]
upper_bounds = [10.0]
initial_params = [5.0]
optimizer_to_use = Fminbox(BFGS())
solution = Optim.optimize(
    obj,
    lower_bounds,
    upper_bounds, # required by Fminbox
    initial_params,
    optimizer_to_use,
    opt_options;
    autodiff = :finite, # :finite or :forward; fails when using ForwardDiff within obj
)
## minimum is -2.0 at x = 0.0
solution.minimizer, solution.minimum
Haven’t run it, but at first glance, shouldn’t this be ForwardDiff.derivative(x -> 2.0 * (x - 1)^2, x[1])? x[1] is your input, and the closure’s x is a separate scalar variable for the function being differentiated — as written, the closure shadows the outer x and then indexes a scalar, which is why it fails.
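Building on that suggestion, here is a minimal sketch of the corrected objective (the function name obj_nested is mine; the closure argument is renamed to t so it cannot shadow the outer x, and ForwardDiff.derivative is called at the scalar x[1]):

using ForwardDiff

# Inner derivative: d/dt 2(t - 1)^2 = 4(t - 1), evaluated at x[1],
# so this reproduces the "works" objective 2(x-1)^2 + 4(x-1).
function obj_nested(x)
    return 2.0 * (x[1] - 1)^2 + ForwardDiff.derivative(t -> 2.0 * (t - 1)^2, x[1])
end

obj_nested([0.0]) # = -2.0, matching the expected minimum

With the shadowing fixed, autodiff = :finite should work directly; :forward involves nested dual numbers, which ForwardDiff is designed to support via its tagging mechanism, though I haven't verified this exact combination with Fminbox.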