Hello everyone,
I’m trying to optimize the parameters of a nonlinear differential equation towards some known parameters using the ADAM optimizer (I’m benchmarking different optimizers for later use). For some initial conditions, the solver is interrupted in the middle of the training process and prints the warning below:
Warning: Interrupted. Larger maxiters is needed.
└ @ DiffEqBase \.julia\packages\DiffEqBase\3iigH\src\integrator_interface.jl:329
Whether this warning appears seems to depend on the learning rate of the ADAM optimizer, but not on the number of iterations specified in my iterator object. The warning appears later, or not at all, when I reduce the learning rate. Where does this warning come from?
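For context, my current plan for debugging this is to check whether the solver actually finished before using the solution, and to raise maxiters explicitly. The snippet below is only a sketch of that idea on a toy ODE (f_demo, prob_demo, and the maxiters value are placeholders, not part of my actual model), and it assumes retcode is still a Symbol in my DiffEqBase version:

using OrdinaryDiffEq

# Toy ODE just to have a concrete problem object (placeholder, not my model)
f_demo(u, p, t) = p[1] .* u
prob_demo = ODEProblem(f_demo, [1.0], (0.0, 200.0), [0.05])

# maxiters is a keyword of solve; 10^7 is an arbitrary, generous cap
sol = solve(prob_demo, Tsit5(), saveat = 0.005, maxiters = 10^7)

# If the integration was interrupted, the return code is not :Success
if sol.retcode != :Success
    @warn "Solver did not finish cleanly" retcode = sol.retcode
end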
Below is the part of my code that deals with the optimization.
function predict_rd()
    # Solve the ODE with the current parameter values p
    sol = solve(prob, Tsit5(), p = p, saveat = 0.005)
    return sol
end
function loss_rd() # loss function
    sol = predict_rd()
    sol = sol[eval_index, :]      # keep only the observed component
    loss = sum(abs2, sol - ref)   # squared error against the reference trajectory
    if loss <= tol
        display("Converged below tolerance - stopping")
        Flux.stop()
    end
    return loss
end
t = 0:0.005:200.0
data = Iterators.repeated((), 500)   # 500 training iterations
opt = ADAM(0.1)
cb = function () # callback function to observe training
    display(loss_rd())
    return
end
# Display the ODE solution with the initial parameter values.
curr_sol = solve(remake(prob, p = p), Tsit5(), saveat = 0.005)
display(plot(t, curr_sol[eval_index, :], ylim = (-1.25, 1.25), label = "Model"))
display(plot!(t, ref, label = "Target"))
display(plot!(t, og_sweep.(t), label = "Sweep"))
# Train
Flux.train!(loss_rd, params, data, opt, cb = cb)
# Plot results
display(p)
curr_sol = solve(remake(prob, p = p), Tsit5(), saveat = 0.005)
display(plot(t, curr_sol[eval_index, :], ylim = (-1.25, 1.25), label = "Model"))
display(plot!(t, ref, label = "Target"))
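In case it helps to see what I have been considering: if the warning means the intermediate parameter values make the ODE diverge, I assume one workaround would be to guard the loss against failed solves instead of comparing a truncated trajectory to ref. This is only a sketch (the penalty value is arbitrary, the retcode check mirrors the one above, and it assumes a Zygote-based Flux where returning a constant from the failure branch is acceptable):

# Sketch of a guarded loss; names (predict_rd, eval_index, ref) follow my code above
function loss_rd_guarded()
    sol = predict_rd()
    if sol.retcode != :Success
        return 1e6   # arbitrary large penalty for a failed solve
    end
    return sum(abs2, sol[eval_index, :] - ref)
end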