This is my first attempt at using DifferentialEquations.jl / DiffEqFlux, and I'm not getting much help from the stack trace. I'm trying to train a neural network that sits inside a differential equation, similar to a reinforcement learning problem.
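For context, here is a stripped-down sketch of the kind of setup I have. The network size, the toy dynamics, and the objective below are placeholders, not my actual model (which is larger and lives in its own module), but the structure is the same: a neural network inside the ODE right-hand side, a `ContinuousCallback` that terminates the integration, and a `Vern9` solve.

```julia
# Stripped-down sketch of the setup (placeholder dimensions/dynamics; the real
# model is larger, but the structure is the same: an NN inside the ODE, a
# terminating ContinuousCallback, and a Vern9 solve).
using OrdinaryDiffEq, DiffEqFlux, Flux

nn = FastChain(FastDense(4, 32, tanh), FastDense(32, 1))   # the policy network
θ  = initial_params(nn)

function myFunc!(du, u, p, t)
    a = nn(u, p)[1]             # NN output acts as a control command
    du[1] = u[3]
    du[2] = u[4]
    du[3] =  a * u[2]           # toy dynamics as a stand-in for the real ones
    du[4] = -a * u[1]
end

# Stop the integration when a miss-distance-like quantity crosses zero
condition(u, t, integrator) = u[1]
cb_term = ContinuousCallback(condition, terminate!)

u0    = [1.0, 0.0, 0.0, 1.0]
tspan = (0.0, 10.0)
prob  = ODEProblem(myFunc!, u0, tspan, θ)

function loss_adjoint(p)
    sol = solve(prob, Vern9(), p = p, callback = cb_term, saveat = 0.1)
    sum(abs2, sol[end])         # placeholder objective
end
```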
When I run training, I get a warning that says:
**┌ Warning: dt <= dtmin. Aborting. There is either an error in your model specification or the true solution is unstable.**
**└ @ DiffEqBase C:\Users\myUserID\.julia\packages\DiffEqBase\3iigH\src\integrator_interface.jl:343**
Then I get this (truncated):
ERROR: BoundsError: attempt to access 1-element Array{Float64,1} at index [0]
Stacktrace:
[1] getindex at .\array.jl:809 [inlined]
[2] (::DiffEqSensitivity.ODEInterpolatingAdjointSensitivityFunction{DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Array{Float64,1},Nothing,Base.OneTo{Int64},UnitRange{Int64},UniformScaling{Bool}},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},DiffEqSensitivity.ZygoteVJP,Bool},Array{Float64,1},DiffEqBase.ODESolution{Float64,2,Array{Array{Float64,1},1},Nothing,Nothing,Array{Float64,1},Array{Array{Array{Float64,1},1},1},DiffEqBase.ODEProblem{Array{Float64,1},Tuple{Float64,Float64},true,Array{Float64,1},DiffEqBase.ODEFunction{true,typeof(Main.myFunc!),UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing},Base.Iterators.Pairs{Symbol,Any,Tuple{Symbol,Symbol},NamedTuple{(:callback, :saveat),Tuple{DiffEqBase.CallbackSet{Tuple{DiffEqBase.ContinuousCallback{Main.FlyPronav_vsTagBot.var"#1#2",typeof(DiffEqBase.terminate!),typeof(DiffEqBase.terminate!),typeof(DiffEqBase.INITIALIZE_DEFAULT),Float64,Int64,Nothing,Int64}},Tuple{}},Float64}}},DiffEqBase.StandardODEProblem},OrdinaryDiffEq.Vern9,OrdinaryDiffEq.InterpolationData{DiffEqBase.ODEFunction{true,typeof(Main.FlyPronav_vsTagBot.flyProNav!),UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing},Array{Array{Float64,1},1},Array{Float64,1},Array{Array{Array{Float64,1},1},1},OrdinaryDiffEq.Vern9Cache{Array{Float64,1},Array{Float64,1},Array{Float64,1},OrdinaryDiffEq.Vern9Tableau{Float64,Float64}}},DiffEqBase.DEStats},DiffEqSensitivity.CheckpointSolution{DiffEqBase.ODESolution{Float64,2,Array{Array{Float64,1},1},Nothing,Nothing,Array{Float64,1},Array{Array{Array{Float64,1},1},1},DiffEqBase.ODEProblem{Array{Float64,1},Tuple{Float64,Float64},true,Array{Float64,1},DiffEqBase.ODEFunction{true,typeof(Main.myFunc!),UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing},Base.Iterators.Pairs{Symbol,Any,Tuple{Symbol,Symbol},NamedTuple{(:callback, :saveat),Tuple{DiffEqBase.CallbackSet{Tuple{DiffEqBase.ContinuousCallback{Main.FlyPronav_vsTagBot.var"#1#2",typeof(DiffEqBase.terminate!),typeof(DiffEqBase.terminate!),typeof(DiffEqBase.INITIALIZE_DEFAULT),Float64,Int64,Nothing,Int64}},Tuple{}},Float64}}},DiffEqBase.StandardODEProblem},OrdinaryDiffEq.Vern9,OrdinaryDiffEq.InterpolationData{DiffEqBase.ODEFunction{true,typeof(Main.FlyPronav_vsTagBot.flyProNav!),UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing},Array{Array{Float64,1},1},Array{Float64,1},Array{Array{Array{Float64,1},1},1},OrdinaryDiffEq.Vern9Cache{Array{Float64,1},Array{Float64,1},Array{Float64,1},OrdinaryDiffEq.Vern9Tableau{Float64,Float64}}},DiffEqBase.DEStats},Array{Tuple{Float64,Float64},1},NamedTuple{(:reltol, :abstol),Tuple{Float64,Float64}}},DiffEqBase.ODEProblem{Array{Float64,1},Tuple{Float64,Float64},true,Array{Float64,1},DiffEqBase.ODEFunction{true,typeof(Main.FlyPronav_vsTagBot.flyProNav!),UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing},Base.Iterators.Pairs{Symbol,Any,Tuple{Symbol,Symbol},NamedTuple{(:callback, 
:saveat),Tuple{DiffEqBase.CallbackSet{Tuple{DiffEqBase.ContinuousCallback{Main.FlyPronav_vsTagBot.var"#1#2",typeof(DiffEqBase.terminate!),typeof(DiffEqBase.terminate!),typeof(DiffEqBase.INITIALIZE_DEFAULT),Float64,Int64,Nothing,Int64}},Tuple{}},Float64}}},DiffEqBase.StandardODEProblem},DiffEqBase.ODEFunction{true,typeof(Main.myFunc!),UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing}})(::Array{Float64,1}, ::Array{Float64,1}, ::Array{Float64,1}, ::Float64) at C:\Users\myUserID\.julia\packages\DiffEqSensitivity\WiCRA\src\local_sensitivity\interpolating_adjoint.jl:108
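My (uncertain) reading of this is that the forward solve aborts at `dt <= dtmin`, so the adjoint pass then tries to interpolate a solution that is essentially empty, and that is where the `BoundsError` comes from. One thing I considered was guarding the loss against aborted solves, something like this (hypothetical, not in my actual code):

```julia
# Hypothetical guard, not in my current code: only evaluate the objective when
# the solve actually finished (either normally or via the terminate! callback).
function loss_adjoint(p)
    sol = solve(prob, Vern9(), p = p, callback = cb_term, saveat = 0.1)
    if sol.retcode != :Success && sol.retcode != :Terminated
        return 1e6              # constant penalty for aborted/unstable solves
    end
    sum(abs2, sol[end])
end
```

But a constant penalty gives no gradient for those trajectories, so I'm not sure that's the right fix either.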
The training command looks like this:
res = DiffEqFlux.sciml_train(loss_adjoint, θ, ADAM(0.1), cb = cb_plot, maxiters = 100)
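Here `cb_plot` is basically just a monitoring callback, roughly like this (simplified; the real one also plots the trajectory):

```julia
# Simplified version of cb_plot: log the loss each iteration.
cb_plot = function (p, l)
    println("Current loss: ", l)
    return false    # returning true would halt sciml_train early
end
```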
Should I be using something other than ADAM as the optimizer here?
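For example, I've seen the DiffEqFlux examples do a short ADAM run followed by BFGS from Optim. Would something along these lines be more appropriate, or is the optimizer choice unrelated to the `dt <= dtmin` issue? (This is hypothetical, based on the docs, not something I've verified on this problem.)

```julia
using Optim  # for BFGS

# Two-stage training as in the DiffEqFlux examples: coarse ADAM pass, then BFGS refinement.
res1 = DiffEqFlux.sciml_train(loss_adjoint, θ, ADAM(0.01), cb = cb_plot, maxiters = 100)
res2 = DiffEqFlux.sciml_train(loss_adjoint, res1.minimizer, BFGS(initial_stepnorm = 0.01),
                              cb = cb_plot, maxiters = 100)
```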