I’m using Optim and the BFGS algorithm to minimize a function. To speed up the minimization I want to provide the gradient of the objective function. However, both the objective function and the gradient depend on some constant parameters. I know how to pass the constant parameters to the objective function.

I solved the problem partially. If I use the following code

using Optim

function f(x, p)
    return (x[1] - p[1])^2 + (x[2] - p[2])^2
end

# Optim expects the gradient as g!(G, x), with G first;
# the parameters p are passed via a closure below.
function g!(G, x, p)
    G[1] = 2 * (x[1] - p[1])
    G[2] = 2 * (x[2] - p[2])
end

function fit()
    p = [3, 6]
    initial_x = rand(2)
    res = optimize(x -> f(x, p), (G, x) -> g!(G, x, p), initial_x, BFGS(),
                   Optim.Options(show_trace = true))
    return res
end

It works fine now. I could still save computation by using “only_fg!(fg!)” (Optim → only_fg!), since the objective and the gradient can share common computations. (Of course not in this example, but in my calculations later.)

function fg!(F, G, x, p)
    # common computations shared by objective and gradient would go here
    if G !== nothing
        G[1] = 2 * (x[1] - p[1])
        G[2] = 2 * (x[2] - p[2])
    end
    if F !== nothing
        return (x[1] - p[1])^2 + (x[2] - p[2])^2
    end
    return nothing
end
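A sketch of how an fg! like this could be wired into optimize via a closure and Optim.only_fg! (the closure captures p; the helper name fit_fg! is my own, not from the original code):

```julia
using Optim

function fg!(F, G, x, p)
    # common computations shared by objective and gradient would go here
    if G !== nothing
        G[1] = 2 * (x[1] - p[1])
        G[2] = 2 * (x[2] - p[2])
    end
    if F !== nothing
        return (x[1] - p[1])^2 + (x[2] - p[2])^2
    end
    return nothing
end

function fit_fg!()
    p = [3, 6]
    initial_x = rand(2)
    # only_fg! tells Optim that a single callback computes F and/or G on demand
    res = optimize(Optim.only_fg!((F, G, x) -> fg!(F, G, x, p)), initial_x, BFGS())
    return res
end
```

The closure (F, G, x) -> fg!(F, G, x, p) gives Optim the three-argument signature it expects while still letting the constant parameters in.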

Unfortunately, I cannot differentiate my real objective by hand. But I can estimate how the objective function is affected by each parameter, so for numerical differentiation I only have to recompute some parts of the objective function.
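A minimal sketch of that idea inside an fg!-style callback, assuming a hypothetical objective split into an expensive part that does not depend on x (computed once per call) and a cheap part that is re-evaluated in a forward finite difference; expensive_part, cheap_part, and the step h are all illustrative assumptions, not the real objective:

```julia
# Hypothetical split: expensive_part(p) is independent of x and is computed
# once, while cheap_part(x, p, E) could reuse that result in a real problem.
expensive_part(p) = sum(p .^ 2)            # stands in for a costly computation
cheap_part(x, p, E) = sum((x .- p) .^ 2)   # E would enter here in the real problem

function fg_fd!(F, G, x, p)
    E = expensive_part(p)                  # shared work, done once per call
    f0 = cheap_part(x, p, E)
    if G !== nothing
        h = 1e-6                           # finite-difference step (assumption)
        for i in eachindex(x)
            xp = copy(x)
            xp[i] += h
            # forward difference: only the cheap part is re-evaluated
            G[i] = (cheap_part(xp, p, E) - f0) / h
        end
    end
    if F !== nothing
        return f0
    end
    return nothing
end
```

Since f0 and E are computed once and reused for every perturbed evaluation, only the x-dependent part of the objective is recomputed per gradient component.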