Hi,

I am currently solving an ODE with explicit time dependence in DifferentialEquations.jl. I integrate up to a final time t_max, with the solution saved at 1000 time points in between via the `saveat` option. Now I want to zoom in on the time evolution between t_1 and t_2, where 0 < t_1 < t_2 < t_max. Since I have saved the solution at t_1, I start a second integration at t_1 and run it to t_2, again with 1000 time points, to see what happens in between. However, the solution at t_2 differs from the longer run. I suspect this is a consequence of the time dependence of the problem: the integrator may be taking time steps that do not give a stable solution in one of the two cases.
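For concreteness, here is a minimal sketch of the two-run setup described above. The right-hand side `f`, the solver `Tsit5()`, and all the numbers are placeholders standing in for the actual problem:

```julia
using DifferentialEquations

# Hypothetical time-dependent RHS -- a stand-in for the real problem.
f(u, p, t) = -u + sin(t)

t_max = 10.0
prob_full = ODEProblem(f, 1.0, (0.0, t_max))
sol_full = solve(prob_full, Tsit5(); saveat = range(0.0, t_max; length = 1000))

# Restart from the value saved at t_1 and integrate only to t_2.
t1, t2 = 2.0, 5.0
u_t1 = sol_full(t1)
prob_zoom = ODEProblem(f, u_t1, (t1, t2))
sol_zoom = solve(prob_zoom, Tsit5(); saveat = range(t1, t2; length = 1000))

# sol_full(t2) and sol_zoom(t2) agree only up to the integration
# tolerances, since the two runs take different adaptive step sequences.
```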

My idea is to make the integrator take smaller steps so that this does not happen. The easiest way to do this would be to tighten (decrease) the relative and absolute tolerances, but this increases the runtime by quite a lot. Even going down to `abstol = reltol = 10^{-7}`, the problem did not resolve itself. I was wondering if one could force the integrator, in both runs, to take time steps smaller than some dt_{cont} on top of the adaptive time stepping. If anyone could help me, either with imposing such a step-size limit on top of the adaptive stepping or with anything else I am missing in my problem, I would be very grateful.

Thanks,

Integrators store an interpolation, so you can see the value between samples.
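That is, with the default `dense = true` (i.e. without `saveat`) the returned solution object carries a continuous interpolant and can be queried at any time in the integration span. A sketch, with a hypothetical RHS `f` and the `Tsit5()` solver as assumptions:

```julia
using DifferentialEquations

f(u, p, t) = -u + sin(t)              # hypothetical RHS
prob = ODEProblem(f, 1.0, (0.0, 10.0))

# dense output is on by default, so the interpolant is stored
sol = solve(prob, Tsit5())

sol(2.5)   # value of the continuous interpolant between the adaptive steps
```

Note that storing the interpolant costs extra memory per step, which matters for large systems.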

I am not sure I understand the issue here. Numerical algorithms for ODEs return approximations of the true solution. The quality of such an approximation is determined by the parameters of the algorithm and by the sizes of the time steps that those parameters produce. That the solutions for different parameters do not agree is not a consequence of the properties of the problem but rather of the (parameters of the) method.

If you want to impose an upper bound on the step size for an algorithm with adaptive step-size control, there is the `dtmax` keyword. See the section "Basic stepsize control" at Common Solver Options (Solve Keyword Arguments) · DifferentialEquations.jl.
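For illustration, a minimal sketch of capping the step size alongside tight tolerances; the RHS `f`, the solver choice, and the cap value are assumptions, not taken from the thread:

```julia
using DifferentialEquations

f(u, p, t) = -u + sin(t)              # hypothetical RHS
prob = ODEProblem(f, 1.0, (0.0, 10.0))

# dtmax caps the adaptive step size; the error-based controller still
# shrinks steps further whenever its local error estimate demands it.
sol = solve(prob, Tsit5();
            dtmax  = 0.01,
            abstol = 1e-7, reltol = 1e-7,
            saveat = range(0.0, 10.0; length = 1000))
```

Using the same `dtmax` in both the full run and the zoomed-in run should make the two step sequences, and hence the endpoint values, agree more closely.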

True. But my grid size and time span are pretty large, so I have turned off the interpolation; with it the system runs out of memory. Thanks for the suggestion.

Thanks for the `dtmax` keyword. I had missed it completely while looking through the otherwise excellent documentation. I will have to see if this solves the problem.

I agree that the parameters of the method are essential for the solution. My ODE function depends on the instantaneous time at which it is evaluated. So if I run the time evolution from 0 to t_max, it should pass through the times t_1 and t_2 and converge to a solution. Ideally I should get the same solution if I only evolve from t_1 to t_2, where 0 < t_1 < t_2 < t_max.