# How is the max operator dealt with in DualNumbers.jl?

**vgdev**#1

I am minimizing an objective function using a local optimization algorithm, supplying it with the gradient generated by ForwardDiff. However, my objective function contains functionals involving the `max(x, 0)` operator, which is non-differentiable at x = 0. Experiments show that the ForwardDiff gradient and the finite-difference gradient agree, and I am able to find the minimum. Still, how is the `max` operator dealt with in `DualNumbers.jl`? I did not find anything about it in the source. Is it smoothed somehow?

**stevengj**#2

You should really rewrite your optimization problem to be differentiable, since otherwise your optimizer might get stuck (or at least converge very slowly). In problems involving `max` or `min`, you can generally do this by a standard trick: adding dummy variables and constraints.

See, for example: http://nlopt.readthedocs.io/en/latest/NLopt_Introduction/#equivalent-formulations-of-optimization-problems
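The core of that dummy-variable trick is an identity: `max(g, 0)` equals the smallest `t` satisfying the two smooth constraints `t >= g` and `t >= 0`, so the non-smooth term in the objective can be replaced by a new variable `t` that the optimizer minimizes subject to those constraints. A minimal stdlib-Python sketch checking that identity numerically (the example term `x - 1` and the helper name are made up for illustration):

```python
# Epigraph trick: max(g, 0) is the smallest t with t >= g and t >= 0,
# so a kinked term in the objective becomes a dummy variable t plus
# two smooth inequality constraints. Verify the identity on a grid.

def smallest_feasible_t(g, t_grid):
    """Smallest t on the grid satisfying t >= g and t >= 0."""
    return min(t for t in t_grid if t >= g and t >= 0.0)

t_grid = [i / 1000.0 for i in range(-3000, 5001)]  # candidate t values

for x in [-2.0, -0.5, 0.0, 0.5, 2.0]:
    g = x - 1.0                       # example non-smooth term: max(x - 1, 0)
    t_star = smallest_feasible_t(g, t_grid)
    assert abs(t_star - max(g, 0.0)) < 1e-9, (x, t_star)

print("epigraph identity verified")  # → epigraph identity verified
```

In a real solver you would hand `t >= g(x)` and `t >= 0` to the optimizer as inequality constraints, exactly as in the NLopt tutorial linked above.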

**stevengj**#3

I think it just computes a one-sided derivative, i.e. it ignores the discontinuity (just picking the derivative from one side or the other of the kink in the `max` function). See https://github.com/JuliaDiff/DualNumbers.jl/issues/53
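The one-sided behavior is easy to see with a toy dual-number type. This is a sketch of the mechanism in Python, not DualNumbers.jl's actual implementation: `dmax` simply forwards whichever argument has the larger value, so at the kink the derivative of one branch is picked and nothing is smoothed.

```python
# Toy forward-mode dual number: carries (value, derivative) and
# propagates derivatives through arithmetic. `dmax` returns the
# argument with the larger value, so at a kink one side's
# derivative is chosen -- no smoothing occurs.

class Dual:
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __sub__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val - other.val, self.der - other.der)

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)

def dmax(a, b):
    a = a if isinstance(a, Dual) else Dual(a)
    b = b if isinstance(b, Dual) else Dual(b)
    return a if a.val > b.val else b   # tie at the kink goes to b

def derivative(f, x):
    """Derivative of f at x via a unit dual perturbation."""
    return f(Dual(x, 1.0)).der

relu = lambda x: dmax(x, 0.0)
print(derivative(relu, 1.0))   # → 1.0 (right of the kink)
print(derivative(relu, -1.0))  # → 0.0 (left of the kink)
print(derivative(relu, 0.0))   # → 0.0 (at the kink, the tie-break decides)
```

The value at exactly x = 0 is an arbitrary tie-breaking choice; a finite-difference check away from the kink will agree with this on either side, which matches the observation in the original question.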

If your optimizer always stays in the region where x is positive (or always negative), you should be fine. If it jumps back and forth across the kink, you might be in trouble (in particular, quasi-Newton algorithms can get very confused).