To your point, and to use a real example from the forum, consider this post:
I was reading the documentation, and on the Frequently Asked Questions page of DifferentialEquations.jl, in the Stability and Divergence section, it says:
if you see these instability warnings during a parameter estimation process, this is likely the underlying problem. Simply check
sol.retcode != :Success
and throw an Inf
cost and most optimizers will reject steps in those parameter regimes!

I wonder which optimizers will do that? And how exactly do they “reject the steps”? (e.g., do they simply skip the current step, or do they randomly initialize a new set of parameters?)
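The FAQ's advice is about Julia code (checking `sol.retcode` after a solve), but the mechanism is language-agnostic. Here is a minimal Python sketch of the idea, with a stand-in `solve_ode` function (hypothetical, not a real API) that pretends the integration diverges for some parameter values; the wrapper objective reports an infinite cost in those regimes:

```python
import math

def solve_ode(p):
    """Stand-in for an ODE solve: pretend the integration diverges
    whenever the parameter is too large (hypothetical toy, not a real API)."""
    if p > 2.0:
        return None  # analogous to sol.retcode != :Success in Julia
    return (p - 1.0) ** 2  # pretend this is the fitted loss

def cost(p):
    """Objective handed to the optimizer: Inf flags unstable regimes."""
    loss = solve_ode(p)
    return math.inf if loss is None else loss

print(cost(1.5))  # finite loss in the stable regime -> 0.25
print(cost(3.0))  # inf -> the optimizer should reject this step
```

Returning `Inf` (rather than raising an exception) keeps the optimizer running: a candidate point with infinite cost can never look like an improvement, so the failed solve is simply treated as a very bad step.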
The accepted solution was:
They skip the current step and do some line search thing. I know the Optim.jl algorithms support this. I need to dig around to see which others do it well.
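The "line search thing" can be made concrete with a toy sketch. This is not Optim.jl's actual implementation, just an illustration of backtracking: when the trial point has infinite cost (the solver diverged there), the step length is halved and the optimizer tries again, so it backs away from the unstable regime instead of restarting:

```python
import math

def backtracking_step(cost, x, direction, step=1.0, shrink=0.5, max_tries=20):
    """Take a step along `direction`, halving the step length whenever the
    candidate's cost is Inf (or not an improvement over the current point)."""
    f0 = cost(x)
    for _ in range(max_tries):
        candidate = x + step * direction
        f = cost(candidate)
        if math.isfinite(f) and f < f0:
            return candidate  # accept: finite and better
        step *= shrink  # reject: shrink the step and retry
    return x  # no acceptable step found; stay put

# Toy cost: quadratic with its minimum at 1.8, Inf outside the
# "stable" region |x| <= 2 (standing in for a diverging ODE solve).
cost = lambda x: (x - 1.8) ** 2 if abs(x) <= 2.0 else math.inf

# The full step (to x = 4.0) lands in the Inf region and is rejected;
# the halved step (to x = 2.0) is finite and better, so it is accepted.
print(backtracking_step(cost, 0.0, 1.0, step=4.0))  # -> 2.0
```

The key point matching the accepted answer: nothing is re-randomized. The current iterate is kept, the offending step is discarded, and a shorter step along the same search direction is tried.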
While a ChatGPT solution looks like:
The specific optimizers that will reject steps in parameter regimes where the solution is unstable will depend on the optimization algorithm being used. DifferentialEquations.jl uses a variety of optimization algorithms, so it’s not possible to say which ones will reject steps in unstable regimes without knowing which algorithm is being used.
In general, optimizers will often have a way to handle situations where the optimization process encounters an unstable or invalid solution. This could involve skipping the current step and trying a different set of parameters, or it could involve randomly initializing a new set of parameters and starting the optimization process over from the beginning. The specific details will depend on the optimization algorithm being used.
Yuck.