How to debug ODE model aborting midway on sensitivity computation

Hi there,

I have an ODE model that results from the spatial discretization of a PDE with reaction terms. The rate coefficient of the reaction is computed from an externally provided interpolation.

I can run the forward simulation (the plain ODE solve) just fine. However, when I compute sensitivities with respect to the interpolation coefficients, the solver aborts at varying time points with the error "dt was forced below floating point epsilon …". These abort times do not coincide with the interpolation nodes.
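
To make the setup concrete: this is not my actual model, but a schematic of the same structure (all names and numbers here are hypothetical, and I am using SciPy only for illustration) — a reaction ODE whose rate coefficient is interpolated from external coefficients, with a finite-difference check of the sensitivity with respect to one coefficient:

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.interpolate import interp1d

# Hypothetical stand-in: externally provided coefficients c at fixed nodes
# define the rate coefficient k(t) via linear interpolation.
nodes = np.linspace(0.0, 1.0, 5)
c = np.array([1.0, 2.0, 1.5, 3.0, 2.0])

def rhs(t, y, coeffs):
    k = interp1d(nodes, coeffs)(t)  # rate coefficient from the interpolation
    return -k * y                   # toy reaction term

# Forward solve: works without issues.
sol = solve_ivp(rhs, (0.0, 1.0), [1.0], args=(c,), rtol=1e-8, atol=1e-10)

# Finite-difference sensitivity of the final state w.r.t. one
# interpolation coefficient, as a cross-check for the solver's
# sensitivity computation.
eps = 1e-6
c_pert = c.copy()
c_pert[2] += eps
sol_p = solve_ivp(rhs, (0.0, 1.0), [1.0], args=(c_pert,), rtol=1e-8, atol=1e-10)
dy_dc2 = (sol_p.y[0, -1] - sol.y[0, -1]) / eps
print(dy_dc2)
```

In the real model, it is the analogue of `dy_dc2` (computed by the solver's sensitivity machinery rather than finite differences) that triggers the abort.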

Sensitivities with respect to other system parameters work; only the interpolation coefficients trigger this problem. So far I have not been able to distill it down to a tractable MWE.

Do you have some ideas on how to debug this?

Which sensitivity method?