This is, strictly speaking, a math question rather than a Julia one, but I am running into it with AD and optimization.
I am solving a parametric system f(x, θ) = g(x, θ) for x given θ. I could implement the residual as

```julia
function residual(x, θ)
    F = f(x, θ)
    G = g(x, θ)
    F .- G
end
```
but near the optimum it may make more sense to use a relative criterion like `@. (F - G)/F` or `@. (F - G)/G`. However, these may fail if either `F ≈ 0` or `G ≈ 0`.
Some textbooks recommend something like `@. (F - G)/max(1, F, G)`, but then the derivatives are not continuous.
Is there a “standard” way of doing this in a continuously differentiable manner? I think I could get something by combining a softmax, but thought I would ask here first.
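For concreteness, here is a rough sketch of the kind of smooth scaling I have in mind, replacing `max` with a logsumexp-style soft maximum and `abs` with `sqrt(x^2 + ε)`. The helper names `smoothmax` and `smoothabs` and the parameters `k` and `ε` are just placeholders I made up, not an established recipe:

```julia
# Smooth stand-ins: sqrt(x^2 + ε) ≈ |x|, and logsumexp/k ≈ max.
# Larger k makes smoothmax closer to the hard max; smoothmax ≥ max,
# so with 1.0 among the arguments the denominator stays bounded away from 0.
smoothabs(x; ε = 1e-8) = sqrt(x^2 + ε)
smoothmax(k, xs...) = log(sum(exp(k * x) for x in xs)) / k

function scaled_residual(x, θ; k = 10.0)
    F = f(x, θ)
    G = g(x, θ)
    @. (F - G) / smoothmax(k, 1.0, smoothabs(F), smoothabs(G))
end
```

One caveat I can already see: for large `k` the `exp(k * x)` terms can overflow, so a shifted (numerically stable) logsumexp would probably be needed in practice.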