The `EarlyStopper` objects defined in EarlyStopping.jl consume a sequence of numbers called losses, generated by some external algorithm (generally the training loss or out-of-sample loss of some iterative statistical model), and decide when those losses have dropped sufficiently to warrant terminating the algorithm.
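For illustration, here is a minimal sketch of typical use, based on the package's `EarlyStopper` wrapper and its `done!`/`message` interface (the loss values below are made up):

```julia
using EarlyStopping

# Stop on two consecutive loss increases, or as soon as a NaN appears:
stopper = EarlyStopper(Patience(n=2), NotANumber())

done!(stopper, 9.5)   # false
done!(stopper, 10.0)  # false (one increase so far)
done!(stopper, 10.5)  # true  (second consecutive increase)

message(stopper)      # reports which criterion triggered the stop
```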
The package is mainly intended for developers in the ML space. There are plans to use it in MLJ.jl, in a model wrapper for controlling iterative models (including "self-tuning" models).
A number of commonly applied stopping criteria are included out-of-the-box, including all of those surveyed in Prechelt, Lutz (1998): "Early Stopping - But When?", in *Neural Networks: Tricks of the Trade*, ed. G. Orr, Springer.
criterion              | description                                        | notation in Prechelt
-----------------------|----------------------------------------------------|---------------------
`Never()`              | Never stop                                         |
`NotANumber()`         | Stop when `NaN` encountered                        |
`TimeLimit(t=0.5)`     | Stop after `t` hours                               |
`GL(alpha=2.0)`        | Stop after "Generalization Loss" exceeds `alpha`   | GL_α
`PQ(alpha=0.75, k=5)`  | Stop after "Progress-modified GL" exceeds `alpha`  | PQ_α
`Patience(n=5)`        | Stop after `n` consecutive loss increases          | UP_s
`Disjunction(c...)`    | Stop when any of the criteria `c` apply            |
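To see when a single criterion (or a `Disjunction` of criteria) would fire on a pre-recorded loss sequence, something like the following sketch can be used; it assumes the package exposes a `stopping_time(criterion, losses)` helper, and the loss values are illustrative only:

```julia
using EarlyStopping

losses = [10.0, 9.0, 9.5, 10.0, 10.5, NaN]

# Index of the loss at which each criterion first says "stop":
stopping_time(NotANumber(), losses)                              # fires at the NaN entry
stopping_time(Patience(n=3), losses)                             # fires after 3 consecutive increases
stopping_time(Disjunction(Patience(n=3), NotANumber()), losses)  # whichever fires first
```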