I need to optimize a function f(u) defined over a closed domain u ∈ [0, 1]. I was happily using Newton’s method and ForwardDiff for this until I tried upgrading to v1.0. Now ForwardDiff intentionally gives the derivative on the right of u, which means that while f(u=1.) is defined, f(u=Dual(1.,1.)) = Dual(NaN,NaN).
I see this change marked at the top of the README, but I don’t see any suggestions for how to get the derivative on the left of u or any other “designed” approach to deal with closed domains like this.
using ForwardDiff

# Chain rule: d/dt[-f(-t)] at t = -x equals f'(x), but f now sees the dual
# perturbation with a negative sign, i.e. pointing to the left of x.
function LeftDeriv(f, x)
    ForwardDiff.derivative(t -> -f(-t), -x)
end
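A quick usage sketch for the call pattern (the quadratic below is just a stand-in, not the OP’s objective):

g(u) = u^2 + u          # smooth stand-in, so both one-sided derivatives agree
LeftDeriv(g, 1.0)        # 3.0, same as ForwardDiff.derivative(g, 1.0) for smooth g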
I just want to mention this, as it was much quicker in my use case: have you tried avoiding autodiff and just using a dedicated algorithm for a bounded interval, as described in the section “Minimizing a univariate function over a bounded interval” here: Minimizing a function · Optim? I know this doesn’t address the OP, but it might be useful nonetheless.
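For example, the bounded univariate interface looks roughly like this (the objective here is a placeholder, not the OP’s function):

using Optim

f(u) = (u - 0.3)^2                 # placeholder objective on [0, 1]

res = optimize(f, 0.0, 1.0)        # derivative-free search on the closed interval (Brent’s method by default)
Optim.minimizer(res)               # ≈ 0.3
Optim.minimum(res)                 # ≈ 0.0

Since this never calls ForwardDiff, it sidesteps the boundary-derivative question entirely.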
This certainly gives the left derivative. However, you will need logic to use right derivatives on the left boundary and left derivatives on the right boundary.
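Something along these lines, assuming the domain is [lo, hi] (the helper names and the dispatch rule are my sketch, not from the thread):

using ForwardDiff

left_deriv(f, x)  = ForwardDiff.derivative(t -> -f(-t), -x)   # perturbation points left
right_deriv(f, x) = ForwardDiff.derivative(f, x)              # perturbation points right

function boundary_deriv(f, x; lo = 0.0, hi = 1.0)
    if x >= hi
        left_deriv(f, x)     # right boundary: approach from inside the domain (from the left)
    else
        right_deriv(f, x)    # interior and left boundary (x <= lo): usual right-pointing perturbation
    end
end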
I ended up allowing u to overshoot the interval and then clamping the result afterwards. Clearly, this will not work for functions that are undefined (or have undefined derivatives) outside the interval.
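Concretely, the overshoot-then-clamp idea might look like this sketch (the iteration count, the nested-ForwardDiff second derivative, and the helper name are assumptions, not details from the post):

using ForwardDiff

# Run unconstrained Newton steps on f', letting u leave [0, 1] if it wants to,
# then project the final iterate back onto the closed domain.
function newton_then_clamp(f, u0; iters = 20)
    u = u0
    for _ in 1:iters
        g = ForwardDiff.derivative(f, u)                                    # f'(u)
        h = ForwardDiff.derivative(x -> ForwardDiff.derivative(f, x), u)    # f''(u)
        u -= g / h
    end
    clamp(u, 0.0, 1.0)
end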
My use case is pretty specific, so I’m not using Optim.