# Optim - provide gradient with fixed parameters

I’m using Optim and the BFGS algorithm to minimize a function. To speed up the minimization I want to provide the gradient of the objective function. However, both the objective function and the gradient depend on some constant parameters. I know how to pass the constant parameters to the objective function via

```
optimize(x -> mse(x, p), start_guess, BFGS())
```

How can I do the same with the gradient function?


Why not in the same way?

```
optimize(x -> mse(x, p), x -> g(x, p), start_guess, BFGS())
```

But it’s not working.


I solved the problem partially. If I use the following code

```
using Optim

function f(x, p)
    return (x[1] - p[1])^2 + (x[2] - p[2])^2
end

# Optim expects the in-place gradient as g!(G, x); p is closed over below.
function g!(G, x, p)
    G[1] = 2 * (x[1] - p[1])
    G[2] = 2 * (x[2] - p[2])
end

function fit()
    p = [3, 6]
    initial_x = rand(2)
    res = optimize(x -> f(x, p), (G, x) -> g!(G, x, p), initial_x, BFGS(),
                   Optim.Options(show_trace = true))
    return res
end
```

it works fine now. I could still save computation by using the function `only_fg!(fg!)` (Optim -> only_fg!). (Not in this toy example, of course, but in my later calculations.)

```
function fg!(F, G, x, p)
    # shared computations would go here (none needed in this toy example)
    if G !== nothing
        G[1] = 2 * (x[1] - p[1])
        G[2] = 2 * (x[2] - p[2])
    end
    if F !== nothing
        return (x[1] - p[1])^2 + (x[2] - p[2])^2
    end
end
```

However, calling this function by

```
optimize(Optim.only_fg!(x -> fg!(x, [3, 5])), [0.0, 0.0], LBFGS())
```

is not working.

No, because you should write

```
(F, G, x) -> fg!(F, G, x, [3, 5])
```
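Putting the pieces together, a minimal self-contained sketch (assuming the `fg!` from above and target parameters `[3, 5]`) could look like:

```julia
using Optim

function fg!(F, G, x, p)
    # shared computations between objective and gradient would go here
    if G !== nothing
        G[1] = 2 * (x[1] - p[1])
        G[2] = 2 * (x[2] - p[2])
    end
    if F !== nothing
        return (x[1] - p[1])^2 + (x[2] - p[2])^2
    end
end

p = [3, 5]
res = optimize(Optim.only_fg!((F, G, x) -> fg!(F, G, x, p)), [0.0, 0.0], LBFGS())
Optim.minimizer(res)  # should be close to [3.0, 5.0]
```

The closure `(F, G, x) -> fg!(F, G, x, p)` captures the constant parameters `p`, so Optim only ever sees the three-argument signature it expects.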

Why don’t you use the (built-in?) automatic differentiation?

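For reference, Optim can build the gradient with ForwardDiff via the `autodiff` keyword; a minimal sketch using the toy objective from above:

```julia
using Optim

f(x, p) = (x[1] - p[1])^2 + (x[2] - p[2])^2
p = [3, 6]

# No hand-written gradient needed: Optim derives it with forward-mode AD.
res = optimize(x -> f(x, p), rand(2), BFGS(); autodiff = :forward)
```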

Thanks a lot for this clear answer. This of course works.

Unfortunately, for the objective function I have to calculate eigenvalues and eigenvectors, and automatic differentiation cannot be applied.

But you can differentiate them by hand?

Unfortunately, I cannot differentiate them by hand. But I can estimate how the objective function is affected by each parameter, so for numerical differentiation I only have to recompute some parts of the objective function.
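For that numerical differentiation, a central-difference sketch in Optim’s in-place gradient style might look like this (the helper name `fd_grad!` is hypothetical; in practice only the parts of the objective affected by `x[i]` would be recomputed):

```julia
# Hypothetical helper: central-difference gradient following Optim's
# in-place g!(G, x) convention. `obj` is the (possibly expensive) objective.
function fd_grad!(G, x, obj; h = 1e-6)
    for i in eachindex(x)
        xp = copy(x); xp[i] += h
        xm = copy(x); xm[i] -= h
        G[i] = (obj(xp) - obj(xm)) / (2h)
    end
    return G
end

# Usage with a closure over fixed parameters p:
# optimize(x -> f(x, p), (G, x) -> fd_grad!(G, x, x -> f(x, p)), x0, BFGS())
```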