# How to estimate many successive parameters of a system of ODEs?

Hi,

I have a system of ODEs whose parameters I’d like to estimate by fitting it to observational data. The only “interesting” part of this problem is that the parameters vary on a daily basis.
A naive approach would be to perform the optimization for each day individually, use the results as the initial conditions for the next day, and in this way estimate all the parameters successively.
However, this seems cumbersome, and Julia’s range of packages for solving ODEs and parameter estimation is nearly overwhelming. That’s why I’m hoping that I just missed a package or method, and I would be very grateful if someone could point me in the right direction.
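For reference, the naive day-by-day approach is straightforward to sketch. Below is a minimal, hypothetical example in Python (the same structure carries over to Julia with DifferentialEquations.jl and an optimizer): a toy logistic-growth model whose rate parameter changes each day, fitted one day at a time, warm-starting each fit from the previous day’s estimate and chaining the final state into the next day. The model, noise levels, and parameter values are all made up for illustration.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

# Toy model (hypothetical): logistic growth dx/dt = r*x*(1 - x/K),
# where the growth rate r takes a different value each day.
def rhs(t, x, r, K):
    return r * x * (1.0 - x / K)

def simulate(x0, r, K, t_eval):
    sol = solve_ivp(rhs, (t_eval[0], t_eval[-1]), [x0], args=(r, K),
                    t_eval=t_eval, rtol=1e-8)
    return sol.y[0]

rng = np.random.default_rng(0)
K = 10.0
true_r = [0.5, 0.8, 0.3]           # one growth rate per day (ground truth)
t_day = np.linspace(0.0, 1.0, 20)  # observation times within one day

# Generate synthetic daily observations, chaining the state across days.
x0, data = 1.0, []
for r in true_r:
    x = simulate(x0, r, K, t_day)
    data.append(x + 0.05 * rng.standard_normal(x.size))
    x0 = x[-1]

# Day-by-day estimation: fit each day's parameter, warm-starting the
# optimizer with the previous day's estimate, and chain the final state
# forward as the next day's initial condition.
estimates, x0, r_guess = [], 1.0, 0.4
for y in data:
    res = least_squares(lambda p: simulate(x0, p[0], K, t_day) - y, [r_guess])
    r_hat = res.x[0]
    estimates.append(r_hat)
    x0 = simulate(x0, r_hat, K, t_day)[-1]  # propagate state to next day
    r_guess = r_hat                         # warm start for the next fit

print(np.round(estimates, 2))
```

This works, but as noted it is cumbersome and each day’s fit ignores information from the other days, which is where filtering approaches come in.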

Hello and welcome to the community

Are these parameters varying “slowly”, i.e., like a drift over time? If so, you might be able to use the techniques I outline in this video.

Hello baggepinnen and thank you for the warm welcome!

Hmm, I hadn’t thought of data assimilation techniques. In the past, I used a Fortran library for ensemble Kalman filters, and it was a pain to set up and also not very robust in my case when using it for parameter estimation.
But I will definitely give it a shot in Julia.

By ‘“slowly”, i.e. like a drift over time’, do you mean that the parameters are continuous over time? — Yes, they are. Although in your example you managed to estimate a discontinuity in the parameter. Or did you ask because, if there are many jumps in the parameters, the Kalman filter might be too slow to keep up with them?

Thanks again for the help!

A Kalman filter uses a statistical model + linear dynamics to model the parameter drift. The simplest possible model is constant dynamics (zero derivative) + Gaussian noise, which is what I used in the example. This works well if parameters are varying slowly, but can also work for discontinuous jumps, provided that you accept that it takes a while before the parameter estimate converges after the jump, as in the video. If you know more about the drift of the parameters, such as

• they vary smoothly (continuous derivative)
• they have a seasonal trend

etc., then you can bake this knowledge into the Kalman filter in order to improve the estimates.
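To make the random-walk idea concrete, here is a minimal, self-contained sketch in Python (not the code from the video; a Julia version with e.g. LowLevelParticleFilters.jl would look analogous). The unknown parameter θ is treated as a state with constant dynamics plus Gaussian noise, θ_{k+1} = θ_k + w_k. Because the assumed measurement y_k = u_k·θ_k + v_k is linear in θ, a plain scalar Kalman filter applies; all numbers are illustrative.

```python
import numpy as np

# Random-walk parameter model: theta_{k+1} = theta_k + w_k, w ~ N(0, q).
# Measurements y_k = u_k * theta_k + v_k are linear in theta,
# so an ordinary (time-varying) Kalman filter can track the drift.
rng = np.random.default_rng(1)
n = 400
u = 1.0 + 0.5 * np.sin(0.1 * np.arange(n))              # known input signal
theta_true = np.where(np.arange(n) < n // 2, 2.0, 3.0)  # jump halfway through
y = u * theta_true + 0.1 * rng.standard_normal(n)

q, r = 1e-4, 0.1**2   # drift-noise and measurement-noise variances
theta, P = 0.0, 10.0  # initial estimate and its variance
est = []
for k in range(n):
    # Predict: the random walk leaves the estimate unchanged,
    # but inflates its variance by the drift noise q.
    P += q
    # Update with measurement y_k = u_k * theta + v_k.
    S = u[k] * P * u[k] + r        # innovation variance
    K = P * u[k] / S               # Kalman gain
    theta += K * (y[k] - u[k] * theta)
    P -= K * u[k] * P
    est.append(theta)

est = np.array(est)
print(est[n // 2 - 1], est[-1])  # near 2.0 before the jump, near 3.0 after
```

Note how the filter converges to the new value after the jump, but only gradually — the small drift variance q is what makes the tracking smooth yet slow to react.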

Yeah, pretty much; it all depends on how the parameters vary and how strict your requirements are. Here’s an example where one parameter/state variable is binary (true/false), in which case I use a particle filter instead. If parameters tend to jump, but you know when this happens, you can also experiment with nonlinear Kalman filters (UKF/EKF) where you inflate the noise covariance for the parameter drift at the time of the jump, in order to allow the filter to converge faster. I show how to do this in this example.
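The covariance-inflation trick can be illustrated with a deliberately simple scalar example in Python (a linear stand-in, not the UKF/EKF example linked above; the same idea applies there). At the known jump time, the drift-noise variance q is temporarily made large, so the filter discounts its old estimate and re-converges from fresh data almost immediately. All values are made up for illustration.

```python
import numpy as np

# Covariance inflation: when you know a parameter jumps at a given time,
# temporarily increase the drift-noise variance q so the filter trusts
# new measurements more and re-converges quickly after the jump.
rng = np.random.default_rng(2)
n, jump = 300, 150
theta_true = np.where(np.arange(n) < jump, 1.0, 4.0)
y = theta_true + 0.1 * rng.standard_normal(n)  # y_k = theta_k + v_k

def run_kf(inflate_at=None):
    q, r = 1e-5, 0.1**2
    theta, P = 0.0, 10.0
    est = []
    for k in range(n):
        qk = 10.0 if k == inflate_at else q  # inflate at the known jump time
        P += qk                              # predict (random-walk model)
        K = P / (P + r)                      # Kalman gain
        theta += K * (y[k] - theta)          # measurement update
        P *= 1 - K
        est.append(theta)
    return np.array(est)

plain = run_kf()
inflated = run_kf(inflate_at=jump)
# Shortly after the jump, the inflated filter is much closer to the truth.
k = jump + 5
print(abs(plain[k] - 4.0), abs(inflated[k] - 4.0))
```

Without inflation, the small q that makes the estimate smooth in steady state also makes it sluggish after the jump; inflating q for a single step trades a brief burst of uncertainty for fast re-convergence.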