[ANN] EarlyStopping 0.1.0

The EarlyStopper objects defined in EarlyStopping.jl consume a sequence of numbers called losses generated by some external algorithm - generally the training loss or out-of-sample loss of some iterative statistical model - and decide when those losses have dropped sufficiently to warrant terminating the algorithm.

The package is mainly intended for developers in the ML space. There is a plan to use it in MLJ.jl in a model wrapper for controlling iterative models (including “self-tuning” models).

A number of commonly applied stopping criteria are included out-of-the-box, including all those surveyed in the paper Prechelt, Lutz (1998):
“Early Stopping - But When?”, in Neural Networks: Tricks of the Trade, ed. G. Orr, Springer.

| criterion | description | notation in Prechelt |
|---|---|---|
| `Never()` | Never stop | |
| `NotANumber()` | Stop when `NaN` encountered | |
| `TimeLimit(t=0.5)` | Stop after `t` hours | |
| `GL(alpha=2.0)` | Stop after "Generalization Loss" exceeds `alpha` | GL_α |
| `PQ(alpha=0.75, k=5)` | Stop after "Progress-modified GL" exceeds `alpha` | PQ_α |
| `Patience(n=5)` | Stop after `n` consecutive loss increases | UP_s |
| `Disjunction(c...)` | Stop when any of the criteria `c` apply | |
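For a sense of the intended usage, here is a minimal sketch, assuming the stopper exposes a `done!(stopper, loss)` update call that records a new loss and returns `true` once a criterion triggers (that function name is an assumption, not shown in this post):

```julia
using EarlyStopping

# Stop after 3 consecutive loss increases.
stopper = EarlyStopper(Patience(n=3))

# Losses that would normally come from some external training loop:
losses = [10.0, 9.0, 9.5, 10.0, 10.5, 11.0]

for (i, loss) in enumerate(losses)
    # `done!` is assumed to record the loss and report whether to stop.
    if done!(stopper, loss)
        println("stopping at iteration $i")
        break
    end
end
```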

It is nice!

It is very useful to be able to stop after a certain time, or after several worse results. However, I miss a criterion for a maximum number of checks that applies in any case, not only when the results get worse. Also, it is not clear whether several criteria can be combined.


Thanks for the feedback!

Yes, you can combine multiple criteria. Just give the `EarlyStopper` constructor multiple arguments, as in the first README example. You can call `message` on the stopper to see which one applied.
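For example (again assuming the `done!(stopper, loss)` update call from the sketch above; the loss values are arbitrary):

```julia
using EarlyStopping

# Any one of these criteria triggering stops the algorithm.
stopper = EarlyStopper(Patience(n=2), NotANumber(), TimeLimit(t=0.5))

done!(stopper, 0.123)  # false
done!(stopper, 0.234)  # false (one increase so far)
done!(stopper, 0.345)  # true  (two consecutive increases)

message(stopper)       # reports which criterion applied
```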

Happy to add new criteria - just open an issue on the repo. PRs are also welcome.


You are right about the multiple criteria; I did not realise that you can combine all the criteria in the same `EarlyStopper`.

Thank you for being open to my suggestion; I have just opened a PR implementing it. For now the name is `MaximumChecks`, but I am open to renaming it. It is a nice package.