Hi,
Is there a way to print the objective function value while running any NLopt algorithm for parameter estimation, when the cost function is built with the build_loss_objective function?
Please consider adding more context to your post, such as where does build_loss_objective come from?

Here is the code I am using:

using DifferentialEquations, RecursiveArrayTools, Plots, DiffEqParamEstim, Optimization,
    OptimizationMOI, OptimizationNLopt, NLopt  # RecursiveArrayTools provides VectorOfArray
# Lotka–Volterra model with one free parameter p[1]
function f(du, u, p, t)
    du[1] = p[1] * u[1] - u[1] * u[2]
    du[2] = -3 * u[2] + u[1] * u[2]
end

u0 = [1.0; 1.0]
tspan = (0.0, 10.0)
p = [1.5]
prob = ODEProblem(f, u0, tspan, p)
sol = solve(prob, Tsit5())

# Generate noisy data from the true solution
t = collect(range(0, stop = 10, length = 200))
randomized = VectorOfArray([(sol(t[i]) + 0.01randn(2)) for i in 1:length(t)])
data = convert(Array, randomized)

# Build the L2 loss objective and optimize with NLopt's COBYLA
obj = build_loss_objective(prob, Tsit5(), L2Loss(t, data), Optimization.AutoForwardDiff())
opt = Opt(:LN_COBYLA, 1)
optprob = Optimization.OptimizationProblem(obj, [1.3])
res = solve(optprob, opt)
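For reference, once solve returns, the estimated parameter and the final loss can be read from the result; res.u and res.objective are the standard Optimization.jl solution fields, as far as I know:

println(res.u)         # estimated parameter, should end up near the true value 1.5
println(res.objective) # final value of the L2 loss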
Thank you for your response @baggepinnen. I am following the example given here: Global Optimization via NLopt · DiffEqParamEstim.jl
I would like to see verbose output from the optimizer, that is, the objective function value (and the gradient, if available) at each iteration.
Add a callback to the optimization. The callback interface is described in the solve docstring:
For example:
callback = function (state, l) # callback function to observe training
    display(l)    # print the current objective value
    return false  # returning false means "do not stop the optimization"
end
and then pass callback = callback in the kwargs of solve.
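Putting it together with the code above, a minimal sketch (it assumes the optprob and opt objects defined earlier in this thread; state.u holding the current parameter estimate follows my reading of the Optimization.jl callback state):

callback = function (state, l)
    # print current parameter estimate and objective value at each iteration
    println("p = ", state.u, ", objective = ", l)
    return false   # return true to stop the optimization early
end

res = solve(optprob, opt; callback = callback)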
Thank you so much for helping me out @ChrisRackauckas