Lipschitz continuity -- Neural ODEs

Here’s an open-ended topic that I haven’t seen mentioned explicitly in the literature.

In order to integrate neural ODEs, there need to be guarantees regarding the existence of their solutions. By the Picard–Lindelöf theorem, existence and uniqueness hold when the right-hand side is Lipschitz continuous, but an arbitrary NN approximation, be it from a CNN, a GNN, or simply an SHLNN (single hidden layer, fully connected NN), is not necessarily Lipschitz continuous. While there is work on Lipschitz continuous neural networks, there is little reference to such work in the neural ODE literature.
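
For reference, the classic failure mode when the right-hand side is not Lipschitz is non-uniqueness: the scalar initial value problem

$$\dot{x} = 2\sqrt{|x|}, \qquad x(0) = 0$$

admits both $x(t) \equiv 0$ and $x(t) = t^2$ (for $t \ge 0$) as solutions.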

@ChrisRackauckas @avikpal Have you thought about this? If I did miss any papers from you, could you please point me to them?

Thanks in advance.

Why wouldn’t they be Lipschitz? NNs are usually finite compositions of Lipschitz functions (typically linear operations like matrix multiplies and convolutions, plus Lipschitz activation functions like ReLUs), no?
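
A minimal sketch of that composition argument, assuming a hypothetical two-layer MLP with ReLU activations: each affine layer is Lipschitz with constant equal to its spectral norm, so the product of the layer norms is a valid (if often loose) upper bound on the network’s Lipschitz constant.

```julia
using LinearAlgebra

# Hypothetical two-layer MLP: f(x) = W2 * relu.(W1 * x .+ b1) .+ b2
W1, b1 = randn(32, 2), randn(32)
W2, b2 = randn(2, 32), randn(2)

# ReLU is 1-Lipschitz and an affine map x -> W*x .+ b is Lipschitz with
# constant opnorm(W, 2) (its largest singular value), so the composition
# is Lipschitz with constant at most the product of the layer norms.
L_bound = opnorm(W1, 2) * opnorm(W2, 2)
```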

There are even algorithms to automatically estimate the Lipschitz constant of DNNs and to bound the Lipschitz constant during training.
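
For instance, spectral normalization (Miyato et al., 2018) bounds the constant during training by rescaling each weight matrix; a rough, self-contained sketch using power iteration (the function name here is illustrative, not from any particular library):

```julia
using LinearAlgebra

# Rough sketch of spectral normalization: estimate the largest singular
# value of W by power iteration, then rescale so opnorm(W, 2) ≈ 1, which
# caps the Lipschitz constant of the corresponding linear layer at 1.
function spectral_normalize(W; iters = 5)
    u = normalize(randn(size(W, 1)))
    local v
    for _ in 1:iters
        v = normalize(W' * u)  # approximate right singular vector
        u = normalize(W * v)   # approximate left singular vector
    end
    σ = dot(u, W * v)          # approximate largest singular value
    return W ./ σ              # rescaled weights have opnorm(·, 2) ≈ 1
end
```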

Or do you mean you are worried that the Lipschitz constant may be large? But even if it is large, the ODE solution will still exist, no?
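
To be concrete, Picard–Lindelöf only needs some finite Lipschitz constant $L$; its size controls how fast nearby solutions can separate, via Grönwall’s inequality,

$$\|x(t) - y(t)\| \le e^{L t}\, \|x(0) - y(0)\|,$$

so a large $L$ tends to show up in practice as stiffness and tighter step-size requirements, not as nonexistence.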

They are Lipschitz, so I don’t understand the fuss.

And then if the Lipschitz constant is large, this paper goes into detail about how the adjoint is changed to handle that case:
