Number of times loss function is called

Hi,

In the Lotka-Volterra example, I have placed the following:

# Assumed context (not shown in the post): predict, Xₙ, and p are defined
# earlier in the tutorial. Depending on your package versions, ADAM may be
# spelled Adam and comes from OptimizationOptimisers.
using Optimization, OptimizationOptimisers, ComponentArrays, Statistics, Zygote

function loss(θ)
    println("loss is called")
    X̂ = predict(θ)
    mean(abs2, Xₙ .- X̂)
end

losses = Float64[]

callback = function (p, l)
    push!(losses, l)
    if length(losses) % 10 == 0
        println("Current loss after $(length(losses)) iterations: $(losses[end])")
    end
    return false
end

adtype = Optimization.AutoZygote()
optf = Optimization.OptimizationFunction((x, p) -> loss(x), adtype)
optprob = Optimization.OptimizationProblem(optf, ComponentVector{Float64}(p))

res = Optimization.solve(optprob, ADAM(), callback = callback, maxiters = 10)

This prints "loss is called" 20 times.

Why, if I set maxiters = 10?

Best Regards

I noticed this behavior recently and had the same question. I suspect it is a result of the optimizer computing the gradient of the loss function. You might get a better sense of what's going on under the hood by printing some additional information about the current value of x (or some subset of x) being passed to your loss function.
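For instance, a minimal sketch reusing the loss from the original post (θ is the parameter vector the optimizer passes in; printing θ[1] is just an arbitrary choice):

function loss(θ)
    println("loss called, θ[1] = ", θ[1])   # peek at one parameter component on every call
    X̂ = predict(θ)
    mean(abs2, Xₙ .- X̂)
end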

Yeah, it's because the loss function is called twice per iteration: once directly to get the loss value and once in the gradient calculation. If you want to count only the iterations, you can put the counter in the callback function, which fires once per iteration, as sketched below.
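A minimal sketch of that idea, keeping the callback signature from the original post (the iter_count name is mine, just for illustration):

iter_count = Ref(0)   # incremented once per optimizer iteration, not per loss evaluation

callback = function (p, l)
    iter_count[] += 1
    println("iteration $(iter_count[]): loss = $l")
    return false      # returning true would halt the optimization early
end

With maxiters = 10 this prints exactly 10 lines, even though the loss itself runs 20 times.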

What is the purpose of calling println during the gradient calculation? Does a println call have any influence on the computed gradient?

No, the loss function gets evaluated when computing the gradient, hence the message gets printed again. Zygote runs the forward pass of the function in order to build the pullback, so every statement in the body executes, but the println has no effect on the gradient itself.
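You can reproduce this outside Optimization.jl with plain Zygote (a minimal sketch; assumes only Zygote is installed):

using Zygote

f(x) = (println("f is called"); sum(abs2, x))

f([1.0, 2.0])                    # prints once: direct evaluation
Zygote.gradient(f, [1.0, 2.0])   # prints once more: the forward pass runs while building the pullback

The println happens during the forward pass but contributes nothing to the returned gradient.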