How is the max operator dealt with in DualNumbers.jl?

I am minimizing an objective function with a local optimization algorithm, supplying it with the gradient generated by ForwardDiff. However, my objective function contains functionals involving the max(x, 0) operator, which is non-differentiable at x = 0. Experiments show that the ForwardDiff gradient and the finite-difference gradient agree, and I am able to find the minimum. Still, how is the max operator dealt with in DualNumbers.jl? I did not find anything about it in the source. Is it smoothed somehow?
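
For reference, here is a minimal sketch of the kind of check described above, comparing the ForwardDiff gradient against a central finite difference. The objective `obj` and the point `x0` are made up for illustration; the real objective is not shown in the question.

```julia
using ForwardDiff

# Toy objective containing max(x, 0) terms (stand-in for the real one)
obj(x) = sum(xi -> max(xi, 0.0)^2, x) + sum(abs2, x)

x0 = [0.5, -1.3, 2.0]            # a point away from the kinks at 0
g_ad = ForwardDiff.gradient(obj, x0)

# Central finite difference for comparison
h = 1e-6
g_fd = map(eachindex(x0)) do i
    e = zeros(length(x0)); e[i] = h
    (obj(x0 + e) - obj(x0 - e)) / (2h)
end

g_ad ≈ g_fd                      # true away from the kink
```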

You should really rewrite your optimization problem to be differentiable; otherwise your optimizer might get stuck (or at least converge very slowly). In problems involving max or min, you can generally do this with a standard trick: add dummy variables and constraints, as in the sketch below.

See, for example: NLopt Introduction - NLopt Documentation
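
Here is a minimal sketch of that dummy-variable (epigraph) trick, using JuMP with Ipopt as an illustrative solver; the toy objective `(x - 1)^2 + max(x, 0)` is made up here. Since the objective is increasing in the max term, `t` takes the value `max(x, 0)` at the optimum, and the problem becomes smooth:

```julia
using JuMP, Ipopt

model = Model(Ipopt.Optimizer)
@variable(model, x)
@variable(model, t >= 0)          # t replaces max(x, 0)
@constraint(model, t >= x)        # together with t >= 0, t >= max(x, 0)
# minimize (x - 1)^2 + t instead of (x - 1)^2 + max(x, 0)
@objective(model, Min, (x - 1)^2 + t)
optimize!(model)
value(x), value(t)
```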

I think it just computes a one-sided derivative, i.e., it ignores the non-differentiability and simply picks the derivative from one side or the other of the kink in the max function. See max when values agree? · Issue #53 · JuliaDiff/DualNumbers.jl · GitHub
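
You can see the one-sided behavior directly; this sketch uses ForwardDiff (which the question actually uses), and per the linked issue DualNumbers behaves analogously:

```julia
using ForwardDiff

f(x) = max(x, 0.0)

ForwardDiff.derivative(f, -1.0)   # 0.0 — left of the kink
ForwardDiff.derivative(f,  1.0)   # 1.0 — right of the kink
ForwardDiff.derivative(f,  0.0)   # one of the one-sided values; which side
                                  # wins at the tie depends on the package's
                                  # branch choice
```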

So if I never actually evaluate max(x, 0) at x = 0, I should be okay?

If your optimizer always stays in the region where x is positive (or always negative), you should be fine. If it jumps back and forth across the kink, you might be in trouble; in particular, quasi-Newton algorithms can get very confused, since they build a Hessian approximation from gradient differences, and gradients sampled on opposite sides of the kink are inconsistent.
