How to find a root at a diverging boundary?

Say I have a function f(x; \lambda) that is continuous and smooth for x \in (0, \infty), for any 0 < \lambda < \lambda^*. For each \lambda, there is a single minimum at the same location x^*, with value f(x^*; \lambda). f(x^*; \lambda) is a decreasing function of \lambda, and f(x^*; \lambda^*) = 0.

I know that for any \lambda < \lambda^*, it is easy to find the minimum using Optim's Brent method. However, it seems impossible to use Roots.find_zero to locate \lambda^*, since if Roots proposes a \lambda > \lambda^*, f(x; \lambda) diverges.

I am not trained in numerics. Does anyone recognize this kind of problem and have some ideas?

Do you have a bracketing interval? If so, use that and this can't happen. Otherwise, you may need a better initial guess. (The first step in the default algorithm is a secant step, and this may be an issue if your function is flat near the guess.)

Note that the target solution \lambda^* is itself on the boundary, so no bracketing interval is available.

Could you look at the reciprocal of f? 1/f should go to zero at \lambda^\ast, correct? Finding the smallest \lambda with 1/f = 0 might be easier numerically…

The hypotheses are not that easy to understand. But basically, g(\lambda) := f(x^*; \lambda) is a decreasing function of \lambda, and you are looking for a zero of g. Now, you have g(\lambda^*) = 0 and g(\lambda^* + 0) = \infty? :thinking:


Exactly. Is this equivalent to finding the divergence point of g(\lambda)? One approach I can think of is to start from a small enough \lambda, then gradually increase \lambda with a proper step size, checking whether g(\lambda) diverges.
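That stepping idea can be tightened into a bisection on the predicate "g(\lambda) is finite": below \lambda^* the value is finite, above it is not, so the boundary can be bracketed and halved like an ordinary root. A minimal sketch, where the helper name `find_lambda_star` and the toy `g` are purely illustrative; it assumes your `g` actually returns `Inf` (or `NaN`) past \lambda^*, which you may need to arrange yourself, e.g. by catching the divergence and returning `Inf`:

```julia
# Bisection on finiteness of g, not on the sign of g.
# Invariant: g(lo) is finite, g(hi) is not; the boundary λ* lies in [lo, hi].
function find_lambda_star(g, lo, hi; tol = 1e-12)
    @assert isfinite(g(lo)) && !isfinite(g(hi))
    while hi - lo > tol
        mid = (lo + hi) / 2
        if isfinite(g(mid))
            lo = mid   # mid is still on the convergent side
        else
            hi = mid   # mid already diverges
        end
    end
    return (lo + hi) / 2
end

# toy model with λ* = 0.7: finite (and decreasing to 0) below, Inf above
g(λ) = λ <= 0.7 ? 0.7 - λ : Inf
find_lambda_star(g, 0.0, 1.0)  # ≈ 0.7
```

Unlike a fixed step size, this gains one binary digit of \lambda^* per evaluation of g.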

You could attempt to locate the discontinuity with interval arithmetic: apply g to nested intervals and check whether the upper bound of the image is finite or infinite.

If you know that g is convex, I'd say the Newton iterates will stay smaller than \lambda^*, so they approach it from the left.
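A quick numerical check of that claim, with a made-up convex model (the value \lambda^* = 0.7 and all names are illustrative, not the OP's actual g): take g(\lambda) = (\lambda^* - \lambda)^2, which is convex, decreasing for \lambda < \lambda^*, and zero at \lambda^*. Each Newton step from below lands at \lambda + (\lambda^* - \lambda)/2, so the iterates increase monotonically toward \lambda^* without ever crossing into the divergent region:

```julia
# Toy convex model: g(λ) = (λ* - λ)^2 with λ* = 0.7, and its derivative.
λstar = 0.7
g(λ)  = (λstar - λ)^2
dg(λ) = -2 * (λstar - λ)

# Newton iteration started below λ*; here each step simplifies to
# λ ← λ + (λ* - λ)/2, so the gap to λ* halves and λ never exceeds λ*.
function newton(g, dg, λ; iters = 60)
    for _ in 1:iters
        λ -= g(λ) / dg(λ)
    end
    return λ
end

newton(g, dg, 0.1)  # ≈ 0.7, approached monotonically from the left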