How to calculate the loss reached within a given time budget?

Now I'd like to construct a DataFrame of loss values for different optimizers under a given processing-time budget, which should look like the table below.
My question is: how do I make this concrete?

│ Row │ Optimizer │ 1 min │ 2 mins │ 10 mins │ 20 mins │
├─────┼───────────┼───────┼────────┼─────────┼─────────┤
│ 1   │ ADAM      │ 1.2   │ 0.5    │ 0.32    │ 0.12    │
│ 2   │ BFGS      │ 1.3   │ 0.3    │ 0.26    │ 0.07    │
│ 3   │ ADAMR     │ 1.3   │ 0.4    │ 0.23    │ 0.12    │
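(Assuming DataFrames.jl, I picture assembling that table roughly like this once I have the numbers; the values here are just the placeholders from above:)

using DataFrames

# Placeholder values copied from the table above; the real numbers would come
# from the timed optimization runs.
df = DataFrame(
    "Optimizer" => ["ADAM", "BFGS", "ADAMR"],
    "1 min"     => [1.2, 1.3, 1.3],
    "2 mins"    => [0.5, 0.3, 0.4],
    "10 mins"   => [0.32, 0.26, 0.23],
    "20 mins"   => [0.12, 0.07, 0.12],
)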

I'm not sure what the question is: are you asking about the construction of a DataFrame, or about how to impose time constraints on different optimizers? Either way, which packages are you using to fit your models?

Thanks for replying.
I don't think any special packages are involved. I'm just using DiffEqFlux.sciml_train to tune the parameters of my ODEProblem so as to reduce the loss value.
Yes, the latter is what I mean: I want to use different optimizers such as BFGS, ADAM, etc. to see how low they can get the loss function within a given time budget (1 min, 2 mins, 10 mins, …). Do you have any ideas about it?
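For context, what I'm doing has roughly this shape (a toy, self-contained stand-in for my real model, not the actual code):

using OrdinaryDiffEq, DiffEqFlux

# Toy stand-in: fit the decay rate of u' = -p*u to data generated with p = 2.5.
decay(u, p, t) = -p[1] .* u
prob = ODEProblem(decay, [1.0], (0.0, 1.0), [1.0])
data = Array(solve(remake(prob, p = [2.5]), Tsit5(), saveat = 0.1))

function loss(p)
    sol = solve(remake(prob, p = p), Tsit5(), saveat = 0.1)
    sum(abs2, Array(sol) .- data)
end

res = DiffEqFlux.sciml_train(loss, [1.0], DiffEqFlux.ADAM(0.1), maxiters = 200)
res.minimizer   # tuned parameter, should move toward 2.5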

You can just run one really long optimization, save the elapsed time every few iterations in a callback, and then use that time series to read off the value reached within each time budget, e.g. via rootfinding/interpolation on the series.
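In its simplest form (skipping the rootfinding and just doing a masked lookup on the saved series; ts and losses here are made-up names for the elapsed-time and loss vectors produced by the callback sketched further down):

# ts: elapsed seconds since the first recorded iteration,
# losses: loss at each recorded iteration (see the callback sketch below).
budgets = 60 .* [1, 2, 10, 20]              # 1, 2, 10, 20 minutes, in seconds
best_within(b) = minimum(losses[ts .<= b])  # best loss reached within budget b (assumes at least one point per budget)
row = best_within.(budgets)                 # one row of the table, for one optimizer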

Right, this makes sense. But could you please give me more detailed instructions?
How about wrapping it up in a function-like form? I can't find a good way to make the data passing concrete.
In particular, I can't find a good way to enforce my time budget exactly. All I can do is use a crude loop like the one below, because I don't know whether it's possible to pass the res_10.time_run attribute into the callback as a criterion for deciding whether to store the loss value I'm interested in.
Also, all the code I've seen just uses a normal callback in sciml_train. Is it possible to use a different, self-defined cb such as VectorContinuousCallback() in sciml_train?

@time for i in 1:50
    global res_10
    global time
    # One optimizer step per sciml_train call, warm-started from the previous minimizer.
    res_10 = DiffEqFlux.sciml_train(
        p -> bac_10(p; abstol = 1e-2, reltol = 1e-2),
        res_10.minimizer,
        DiffEqFlux.ADAM(0.5),
        maxiters = 1,
        cb = basic_bac_callback
    )
    # Accumulate the reported wall time of this single step.
    time += res_10.time_run
end

Just call time() inside of the callback and save the array of times, then take the diff of the result. Do it all in the callback to cut off the compile-time outliers.
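A rough sketch of what I mean (the names here are made up for illustration; loss and p0 stand for your own loss function and initial parameters):

using DiffEqFlux

stamps     = Float64[]   # wall-clock time at each callback call
raw_losses = Float64[]   # loss value at each callback call

function timed_callback(p, l)
    push!(stamps, time())
    push!(raw_losses, l)
    return false         # returning false keeps the optimization running
end

# One long run; make maxiters large enough to comfortably exceed your biggest budget.
res = DiffEqFlux.sciml_train(loss, p0, DiffEqFlux.ADAM(0.5),
                             cb = timed_callback, maxiters = 100_000)

# Per-iteration wall times; diffing stamps recorded inside the callback keeps
# the initial compile time out of the series.
iter_times = diff(stamps)

ts     = cumsum(iter_times)    # elapsed seconds since the first callback call
losses = raw_losses[2:end]     # losses aligned with ts

With ts and losses recorded for each optimizer, the per-budget lookup from earlier gives you one row of your table.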

Thanks! Now I totally understand what you mean.