How to record loss during training with Flux.jl?

I want to record the training loss, along with the test loss, during training. I have to compute the test loss separately from the training process, but I am thinking that I should be able to hook into the training machinery to access and save the training loss.

But after spending some time with dead-end errors and non-terminating training loops, I need help: how can I record the training loss during training, without re-evaluating it?


See the very bottom of the ‘Training Models’ section of Flux’s documentation.

There it suggests using Zygote.pullback() for that purpose (and gives example code). Make sure to read the whole section, though. This currently requires you to write a custom training loop instead of calling Flux.train!().
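For reference, here is a minimal sketch of such a custom loop. It assumes a recent Flux version with the explicit-gradient API; the model, data, and Flux.mse loss are placeholders:

```julia
using Flux, Zygote

# Toy model and data, just for illustration
model = Chain(Dense(10 => 32, relu), Dense(32 => 1))
opt_state = Flux.setup(Adam(1e-3), model)
xtrain, ytrain = rand(Float32, 10, 100), rand(Float32, 1, 100)

train_losses = Float32[]

for epoch in 1:100
    # pullback returns the loss value together with a function that
    # computes the gradient, so the loss comes for free with the update
    loss, back = Zygote.pullback(m -> Flux.mse(m(xtrain), ytrain), model)
    grads = back(one(loss))[1]
    Flux.update!(opt_state, model, grads)
    push!(train_losses, loss)   # record without a second forward pass
end
```

The key point is that the loss returned by pullback is the very value used for the gradient, so nothing is evaluated twice.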

Btw, I can recommend https://github.com/JuliaLogging/TensorBoardLogger.jl for logging this.
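A minimal sketch of how that could look (the log directory, tag names, and values here are made up):

```julia
using TensorBoardLogger, Logging

# Each run gets its own subdirectory under "logs"
lg = TBLogger("logs/run")

# Log a scalar explicitly at a given step...
log_value(lg, "train/loss", 0.42, step=1)

# ...or route standard @info logging through the TensorBoard logger
with_logger(lg) do
    @info "metrics" train_loss=0.42 test_loss=0.57
end
```

You can then point TensorBoard at the log directory to watch both losses during training.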
