Here’s an open-ended topic that I haven’t seen mentioned explicitly in the literature.
In order to integrate a neural ODE, there need to be guarantees regarding the existence and uniqueness of its solutions; the standard sufficient condition (Picard–Lindelöf) is that the vector field be Lipschitz continuous. And an arbitrary NN approximation, be it from a CNN, a GNN, or simply a SHLNN (single hidden layer, fully connected NN), is not necessarily Lipschitz continuous. While there is work on Lipschitz-continuous neural networks, there is little reference to such work in the neural ODE literature.
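To make the concern concrete, here is the standard textbook counterexample (not specific to neural ODEs) showing why mere continuity of the vector field is not enough for uniqueness:

$$\dot{x} = \sqrt{|x|}, \qquad x(0) = 0,$$

which admits both $x(t) \equiv 0$ and $x(t) = t^2/4$ for $t \ge 0$: the right-hand side is continuous (so Peano’s theorem still gives existence), but it is not Lipschitz at $x = 0$, so uniqueness fails.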
@ChrisRackauckas @avikpal Have you thought about this? If I did miss any papers from you, could you please point me to them?
Why wouldn’t they be Lipschitz? NNs are usually finite compositions of Lipschitz functions (typically linear operations like matrix multiplies and convolutions, and Lipschitz activation functions like ReLU), and a finite composition of Lipschitz functions is itself Lipschitz, no?
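As a minimal sketch of that argument (a hypothetical SHLNN with random numpy weights, not code from any neural ODE library), the product of the weight matrices’ spectral norms gives an explicit global Lipschitz bound, since ReLU is 1-Lipschitz:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical SHLNN: f(x) = W2 @ relu(W1 @ x + b1) + b2
W1 = rng.normal(size=(64, 8))
b1 = rng.normal(size=64)
W2 = rng.normal(size=(8, 64))
b2 = rng.normal(size=8)

def f(x):
    return W2 @ np.maximum(W1 @ x + b1, 0.0) + b2

# An affine map x -> W x + b is Lipschitz with constant ||W||_2
# (the spectral norm), and ReLU is 1-Lipschitz, so the composition
# is globally Lipschitz with constant at most ||W2||_2 * ||W1||_2.
L_bound = np.linalg.norm(W2, 2) * np.linalg.norm(W1, 2)

# Empirical sanity check: ||f(x) - f(y)|| / ||x - y|| should never
# exceed the bound.
ratios = [
    np.linalg.norm(f(x) - f(y)) / np.linalg.norm(x - y)
    for x, y in (rng.normal(size=(2, 8)) for _ in range(1000))
]
print(f"spectral-norm bound: {L_bound:.2f}, max observed ratio: {max(ratios):.2f}")
```

The bound can be very loose, but it is finite, which is the point: any such network is globally Lipschitz, even if the constant is large.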