Flux with early stop and epochs

I’ve found that Flux provides early stopping via callbacks.

Despite what the documentation says, it does not work together with @epochs in my case.

For example, code like the following keeps running the epochs loop:

```julia
valid_check = function ()
    valid_loss = loss(x_test', θ_test', V_test')
    @show valid_loss
    valid_loss < 0.1 && Flux.stop()
end

@epochs 10 Flux.train!(loss, Flux.params(m), data, opt, cb = Flux.throttle(valid_check, 1))
```

Does anyone know what I should do…?

Is valid_loss ever < 0.1? What happens if you remove the check and always call Flux.stop()? If that works, then either the network isn’t learning, the loss threshold is too low, or something funky is going on in the call to loss.

Yes, the criterion has been satisfied, but the epochs loop keeps running.

Always calling Flux.stop() updates the network only once per epoch and terminates only the current epoch; that is, the number of update iterations ends up exactly equal to the total number of epochs.
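That behaviour makes sense if Flux.stop() only breaks out of the current Flux.train! call. If I read the source correctly, Flux.stop() throws a Flux.Optimise.StopException, which train! catches to end its own loop over data, so @epochs simply moves on to the next epoch. Schematically (a simplified sketch of the idea, not the actual Flux source):

```julia
# Simplified sketch of what Flux.Optimise.train! does internally.
# Assumption: based on Flux 0.x behaviour; the real code differs in detail.
function sketch_train!(loss, ps, data, opt; cb = () -> ())
    for d in data
        # ...compute gradients and update parameters here...
        try
            cb()                      # callback may call Flux.stop()
        catch ex
            ex isa Flux.Optimise.StopException || rethrow(ex)
            break                     # StopException only breaks THIS loop,
                                      # not any loop wrapped around train!
        end
    end
end
```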

So, first of all, iHany, I’m new to Julia, so this is probably not the best way to solve your problem…
As far as I can see, @epochs is just a macro that runs a “for loop” while also displaying some extra info. You can get the outcome you are looking for by replacing @epochs with:

```julia
for i in 1:10
    Flux.train!(loss, Flux.params(m), data, opt, cb = Flux.throttle(valid_check, 1))
    if valid_loss < 0.1
        break
    end
end
```

Although this doesn’t look as elegant as @epochs, I don’t think it will make any difference to the efficiency of your program.

Actually, I’m already doing what you suggested to work around the problem, but it’s slightly different from the desired behaviour.

For example, with that approach the epochs loop is terminated (by break) only when valid_loss satisfies the criterion as evaluated “outside” Flux.train!.

This means the alternative cannot handle the case where valid_loss satisfies the criterion in the middle of a Flux.train! call.
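One way to get both behaviours is to let the callback call Flux.stop() (which aborts the current train! call mid-epoch) and also set a flag that the outer loop checks. A sketch; the stopped flag and the use of Ref are my own additions, not Flux API:

```julia
stopped = Ref(false)   # flag shared between the callback and the outer loop

valid_check = function ()
    valid_loss = loss(x_test', θ_test', V_test')
    @show valid_loss
    if valid_loss < 0.1
        stopped[] = true   # remember that the criterion was met...
        Flux.stop()        # ...and abort the current train! call mid-epoch
    end
end

for epoch in 1:10
    Flux.train!(loss, Flux.params(m), data, opt, cb = Flux.throttle(valid_check, 1))
    stopped[] && break     # leave the epochs loop as well
end
```

This way training stops as soon as the criterion is met inside an epoch, and the outer loop does not start another epoch afterwards.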

Anyway, the approach you suggested should work well in most practical problems :slight_smile:

Again, this is probably very crude, but is this closer to what you were hoping for?

```julia
@epochs 10 Flux.train!(loss, Flux.params(model), data, opt,
    cb = () -> valid_loss < 0.1 && Flux.stop())
```

I don’t think it’s exactly what you want, but it might give you something to work on?

If you want to break out of the entire training loop, there’s little point in using Flux.@epochs instead of a plain for loop as written in @anC’s first post (the former literally expands to the latter).
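You can check the expansion yourself with @macroexpand; roughly (the exact output depends on the Flux version):

```julia
using Flux

# Inspect what the macro produces; it should be essentially a plain for loop
# with an @info line, something along the lines of:
#
#   for i = 1:2
#       @info "Epoch $i"
#       f()
#   end
@macroexpand @epochs 2 f()
```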

I agree that the two suggestions are equivalent.