In the blog post linked from the DiffEqFlux package (“DiffEqFlux.jl – A Julia Library for Neural Differential Equations”), there is this section partway down the page:
Notice that we are not learning a solution to the ODE. Instead, what we are learning is the tiny ODE system from which the ODE solution is generated. I.e., the neural network inside the neural_ode layer learns this function:
Thus it learned a compact representation of how the time series works, and it can easily extrapolate to what would happen with different starting conditions. Not only that, it’s a very flexible method for learning such representations. For example, if your data is unevenly spaced at time points t, just pass in saveat=t and the ODE solver takes care of it.
Clearly there is something missing after “this function:” (presumably an equation or image that didn’t copy over), which makes the passage hard to follow. Can anyone illuminate?
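For context, here is a minimal sketch (my own, not copied from the post) of the kind of usage that section seems to be describing. It is written against the blog-era DiffEqFlux API (the `neural_ode` function; newer versions use `NeuralODE` instead), so the exact names may have drifted:

```julia
using DiffEqFlux, DifferentialEquations, Flux

u0    = Float32[2.0, 0.0]                      # starting condition
tspan = (0.0f0, 1.5f0)
t     = Float32[0.0, 0.1, 0.35, 0.4, 0.9, 1.5] # unevenly spaced observation times

# The "tiny ODE system" the post talks about: a small network standing in
# for the unknown right-hand side du/dt = f(u).
dudt = Chain(Dense(2, 50, tanh), Dense(50, 2))

# Wrap the network as a neural ODE layer and ask the solver to return the
# solution exactly at the (uneven) times in `t` via `saveat`.
n_ode(x) = neural_ode(dudt, x, tspan, Tsit5(), saveat=t, reltol=1e-7, abstol=1e-9)

pred = n_ode(u0)   # trajectory of the learned ODE, evaluated at the times in `t`
```

As I read it, the point of the passage is that the network only plays the role of the derivative function, and the solver turns that into a time series, which is why a different `u0` or an uneven `saveat` grid needs no change to the learned model.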