Trouble with CPU+GPU adjoints for UDEs

Hi all! I am currently trying to build UDEs (universal differential equations) where only the neural network runs on a GPU backend. I have been following the examples here, but have not been able to get anything working. Would anyone know of, or be willing to provide, an alternative example of this? I suspect the one linked may be out of date. Thank you!

That code is from early 2020 and would need some updates. GPU-based MNIST Neural ODE Classifier · DiffEqFlux.jl might be a better starting point?
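
For the "NN on GPU only" setup, the rough pattern is to keep the ODE state on the CPU and move just the network parameters to the device, transferring the state in and out around the network call. Here is a minimal, untested sketch of that idea using Lux with CUDA; the package names and calls (`Lux.setup`, `ComponentArray`, `cu`, `InterpolatingAdjoint(autojacvec = ZygoteVJP())`) reflect current SciML conventions as I understand them, and the model sizes and decay term are made up for illustration:

```julia
using Lux, CUDA, ComponentArrays, OrdinaryDiffEq, SciMLSensitivity, Random

rng = Random.default_rng()
nn = Lux.Chain(Lux.Dense(2 => 16, tanh), Lux.Dense(16 => 2))
ps, st = Lux.setup(rng, nn)
ps = ComponentArray(ps) |> cu   # only the NN parameters live on the GPU

function ude!(du, u, p, t)
    # Known physics term plus a learned correction from the NN.
    nn_out, _ = nn(cu(u), p, st)            # evaluate the network on the GPU
    du .= -0.1f0 .* u .+ Array(nn_out)      # bring the result back; state stays on CPU
end

u0 = Float32[1.0, 0.0]
prob = ODEProblem(ude!, u0, (0.0f0, 1.0f0), ps)
sol = solve(prob, Tsit5();
            sensealg = InterpolatingAdjoint(autojacvec = ZygoteVJP()))
```

Note that the per-step `cu(u)` / `Array(nn_out)` transfers are a real cost for small states, which is part of why the fully-GPU MNIST tutorial (where everything, including the state, is a `CuArray`) tends to be the better-supported path.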