I am using Optimization.jl to train a neural network. I’d like to be able to terminate training based on the behavior of the loss evaluated on a validation dataset, rather than just setting a limit on the number of training iterations. I’m imagining that one might use the callback function for this, evaluating the model on the validation set (which may have to be a global variable to be accessible) and checking whether the loss is increasing or decreasing. Has anyone implemented something like this, or are there Optimization.jl tools for this I’m not seeing?
Optimization.jl’s callbacks support this: return false from the callback to keep going, or true to stop the optimization. For example:
callback = function (state, l) # callback function to observe training
    return l < 0.5 # stop once the training loss drops below 0.5
end
would make it halt when the loss is less than 0.5.
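For the validation-based early stopping you describe, you can do the same kind of check against a validation loss inside the callback. Here is a minimal sketch, assuming the `state` argument exposes the current parameters as `state.u`; the `model`, `X_val`, `Y_val`, and `val_loss` names are hypothetical stand-ins for your own setup:

# Hypothetical model and validation data, just to make the sketch self-contained:
model(x, p) = p[1] .* x .+ p[2]
X_val = rand(20)
Y_val = 3 .* X_val .+ 1
val_loss(p) = sum(abs2, model(X_val, p) .- Y_val)

best_val = Ref(Inf)   # best validation loss seen so far
since_best = Ref(0)   # iterations since the last improvement
patience = 10         # how many non-improving iterations to tolerate

callback = function (state, l)
    v = val_loss(state.u)           # state.u is assumed to hold the current parameters
    if v < best_val[]
        best_val[] = v
        since_best[] = 0
    else
        since_best[] += 1
    end
    return since_best[] >= patience # returning true stops the optimization
end

You would then pass this to solve with `callback = callback`, keeping `maxiters` as a fallback bound on training length.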