I am using DiffEqFlux to train neural ODEs. Following the example at this link, I am using the function DiffEqFlux.sciml_train to train the neural network. I'd like to pass/customize some options for the optimizer; for example, I might want to specify f_tol, f_abstol, and show_trace. So I might pass "Optim.Options(f_tol=1.0e-3, f_abstol=1.0e-6, show_trace=true)" as an argument to the "sciml_train" function. What exactly would the syntax look like if I want to do this? I'm trying to get a better fit to my data, so I want to pass in a low tolerance for f so that training completes with a lower loss than it currently returns.
Thanks,
Matt
Any extra keyword arguments get passed down to Optim, so you can just add show_trace = true, f_tol = 1e-3, etc.
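
For example, here is a minimal sketch of what that call can look like. The loss function, initial parameters, and the BFGS choice below are placeholders rather than anything from your setup; the point is just that the Optim-style options ride along as keyword arguments to sciml_train:

```julia
using DiffEqFlux, Optim

# Placeholder loss: stands in for a real neural-ODE loss that returns a scalar.
loss(p) = sum(abs2, p .- 1.0)

p0 = zeros(3)  # placeholder initial parameters

# Keyword arguments that sciml_train doesn't handle itself (show_trace, f_tol, ...)
# get forwarded to the Optim optimizer's options.
result = DiffEqFlux.sciml_train(loss, p0, BFGS();
                                maxiters = 200,
                                f_tol = 1.0e-3,
                                show_trace = true)
```

When an Optim optimizer is used, the returned object should be the usual Optim result, so result.minimizer holds the trained parameters and result.minimum the final loss.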
Note that sciml_train ended up becoming a pretty big piece of machinery, since it wraps all of the Julia nonlinear optimization libraries we could find and throws every reverse-mode AD at them… so that's all getting refactored out to https://github.com/SciML/GalacticOptim.jl fairly soon. After it gets moved out and the API gets cleaned up for its role as a differentiable wrapper over optimization, we'll make sure all of this gets proper documentation.