Passing gradient to Optim

I have the following code:

solution = Optim.minimizer(optimize(θ->J(X,θ),θ,g!,LBFGS()))

where X and θ are arrays and g! is the gradient function. I get the following error only when trying to specify the gradient function g!:
MethodError: no method matching optimize(::getfield(Main, Symbol("##174#175")), ::Array{Float64,1}, ::typeof(g!), ::LBFGS{Nothing,LineSearches.InitialStatic{Float64},LineSearches.HagerZhang{Float64,Base.RefValue{Bool}},getfield(Optim, Symbol("##19#21"))})

Any suggestions?

It looks like you swapped the positions of θ and g!; it should instead be
solution = Optim.minimizer(optimize(θ->J(X,θ),g!,θ,LBFGS()))

See also the manual.

I've tried different argument positions, but the result is the same.

Can you post a minimal (non)working example of your code?

I had a similar question, maybe this helps:

https://github.com/JuliaNLSolvers/NLSolversBase.jl/issues/106
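For reference, the issue linked above comes down to the gradient signature: Optim expects g!(G, θ), where the gradient is written into the first argument G in place. A minimal working sketch, with a hypothetical objective J and data X standing in for the original ones:

```julia
using Optim

# Hypothetical data and objective; substitute your own X and J.
X = [1.0, 2.0, 3.0]
J(X, θ) = sum((X .- θ) .^ 2)

# Optim's convention: g!(G, θ) mutates G, it does not return the gradient.
function g!(G, θ)
    G .= -2 .* (X .- θ)
end

# Argument order is (f, g!, initial_θ, method).
θ0 = zeros(3)
result = optimize(θ -> J(X, θ), g!, θ0, LBFGS())
solution = Optim.minimizer(result)
```

With this objective the minimizer converges to θ ≈ X.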
