DtLessThanMin while err estimate = 1.65e-10

I was troubleshooting a mass-matrix ODE model that fails to converge with DtLessThanMin.

It turns out that, for a properly initialized set of nonlinear algebraic equations (i.e., the mass matrix is all zero), the OrdinaryDiffEq solvers fail when a parameter change is applied at an arbitrary time via a DiscreteCallback.
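For context, the general shape of the setup is something like this. This is only an illustrative sketch with placeholder equations, not my actual model, so it is not an MWE for the failure:

```julia
using OrdinaryDiffEq

# Purely algebraic system: 0 = f(u, p, t)  (mass matrix is all zero).
# The equations are placeholders, not my real model.
function f!(du, u, p, t)
    du[1] = u[1]^2 + u[2] - p[1]   # 0 = u1^2 + u2 - p1
    du[2] = u[1] - u[2]            # 0 = u1 - u2
end

M = zeros(2, 2)                     # all-zero mass matrix
fun = ODEFunction(f!; mass_matrix = M)

u0 = [1.0, 1.0]                     # consistent with p[1] = 2.0
p  = [2.0]
prob = ODEProblem(fun, u0, (0.0, 2.0), p)

# Change the parameter at t = 1 via a DiscreteCallback.
condition(u, t, integrator) = t == 1.0
affect!(integrator) = (integrator.p[1] = 3.0)
cb = DiscreteCallback(condition, affect!)

# tstops forces the integrator to step exactly to t = 1
# so the condition can fire.
sol = solve(prob, Rodas5(); callback = cb, tstops = [1.0])
```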

The warning message shows that the estimated error is tiny compared with 1:

┌ Warning: dt(1.7763568394002505e-15) <= dtmin(1.7763568394002505e-15) at
│ t=1.0, and step error estimate = 1.654973077755173e-10. Aborting. There is
│ either an error in your model specification or the true solution is
│ unstable.
└ @ SciMLBase 

But according to the adaptive timestepping rules, the step should be accepted because the scaled error estimate is less than one.

Question: Why is a step with such a small error estimate not accepted? Do value jumps in the algebraic variables affect the scaled error?
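For reference, my understanding of the default error measure (following the SciML common solver options docs) is roughly

$$
\text{err} = \left\| \frac{E_i}{\mathrm{abstol} + \max\!\left(|u_i^{\text{prev}}|,\, |u_i|\right)\,\mathrm{reltol}} \right\|,
$$

where $E_i$ is the componentwise error estimate, and the step is accepted when $\text{err} \le 1$. If that is right, a jump in an algebraic variable enters the denominator through $|u_i|$, so I don't see how it could make the scaled error larger, which is why the rejection confuses me.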


I then tried applying the same parameter change at time t = 0. With the same initial condition, OrdinaryDiffEq solves the system successfully. At least this verifies that the algebraic system after the parameter change is solvable.

Question: Is it because the error is not checked on the very first step?

I don’t have an MWE yet but would like to ask in case I missed something obvious.

Can you make an MWE? Did you set a tstop at a time very shortly in the future?
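If the callback time is not registered as a tstop, the integrator has to shrink dt to land on it, which can drive dt below dtmin. A sketch of what I mean (assuming a `prob` and callback `cb` like yours):

```julia
# Register the callback time as a tstop so the integrator
# steps exactly to it instead of shrinking dt to hit it.
sol = solve(prob, Rodas5(); callback = cb, tstops = [1.0])
```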

Thank you for your reply! It could be a problem with my model spec. I’m still working on it.