LoadError when using interpolations as input for a neural ode

Okay, I can see how to get this working.

First of all, unsurprisingly, the error comes from trying to differentiate through LinearInterpolation:

julia> gradient(broadband, 0.5)
ERROR: ArgumentError: unable to check bounds for indices of type Interpolations.WeightedAdjIndex{2,Float64}

Some part of that package evidently doesn't play well with automatic differentiation.

The implication is that you can't compute d(du/dt)/dt, i.e. the derivative of the vector field with respect to t.

Now, fortunately, you don't need to! It just so happens that d(du/dt)/dt is only needed to backpropagate through an ODE wrt the initial time point tspan[1], which in your case is fixed. (This isn't a terribly obvious fact, I don't think. I only know it because I've worked on problems of exactly this type before, and had to write the gradient calculations from scratch.)

As a result, it should suffice to simply dummy out the gradient wrt t, and the rest of the gradients will be calculated just fine:

import Zygote

function dudt(u, p, t)
    # Hide the interpolation lookup from Zygote: b is treated as a
    # constant, so no gradient wrt t is ever attempted.
    b = Zygote.ignore(() -> broadband(t))
    nn_model(vcat(u[1], b, u[3], u[4]), p)
end

This is a bit of a hack, of course, but it seems to work.
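For context, here's a minimal self-contained sketch of the workaround in action. Everything here (the interpolation data, the stand-in nn_model, the dimensions) is a placeholder assumption, not your actual code, but it shows that gradients wrt p and u flow through fine once the interpolation lookup is hidden:

```julia
import Zygote
using Interpolations

# Placeholder stand-ins (assumptions, not the original code):
ts = 0.0:0.1:1.0
broadband = LinearInterpolation(ts, sin.(ts))  # toy input signal
nn_model(x, p) = p .* x                        # stand-in for the real network

function dudt(u, p, t)
    b = Zygote.ignore(() -> broadband(t))      # constant wrt autodiff
    nn_model(vcat(u[1], b, u[3], u[4]), p)
end

u0 = ones(4)
p = ones(4)

# Gradient wrt p succeeds; differentiating broadband is never attempted.
Zygote.gradient(p -> sum(dudt(u0, p, 0.5)), p)
```

Without the Zygote.ignore wrapper, the same gradient call hits the WeightedAdjIndex error above.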

This aside, neural differential equations of this type (with time-dependent input) are known as neural controlled differential equations. There’s been a line of work on this you might find interesting:
Neural Controlled Differential Equations for Irregular Time Series, NeurIPS 2020
Neural Rough Differential Equations for Long Time Series, ICML 2021
PyTorch library: torchcde.
(+ one more paper coming soon)

And more tangentially (kind of just for fun wrt this discussion), using an SDE+CDE as a generator-discriminator pair in a GAN:
Neural SDEs as Infinite-Dimensional GANs, ICML 2021.

I hope that helps.
