Conjugate Gradient optimization with set of functions

The issue is just that Optim passes your objective the parameters as a single array rather than as individual arguments, so you can wrap your function like this:

f(betas) = h(betas..., 1, 1, 1, 1, 1)

optimize(
    f,
    ones(6),
    ConjugateGradient(),
    Optim.Options(x_tol = 1e-7);
    autodiff = :forward,
)

I looked up the Optim automatic differentiation API and it's as easy as adding that `autodiff` keyword, so there's no need to hand-code gradients.
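For a runnable picture of the whole pattern, here's a self-contained sketch. Note that `h`, its six betas, and the five trailing fixed arguments are hypothetical stand-ins for your original function; only the wrapping-and-splatting trick and the `optimize` call are the point:

```julia
using Optim

# Hypothetical objective taking individual parameters plus fixed
# trailing arguments (stand-in for the original `h`): a simple
# quadratic minimized at b1..b5 = a1..a5 and b6 = 0.
h(b1, b2, b3, b4, b5, b6, a1, a2, a3, a4, a5) =
    (b1 - a1)^2 + (b2 - a2)^2 + (b3 - a3)^2 +
    (b4 - a4)^2 + (b5 - a5)^2 + b6^2

# Wrapper: Optim hands us one vector, so splat it into `h`
# and pin the remaining arguments.
f(betas) = h(betas..., 1, 1, 1, 1, 1)

res = optimize(
    f,
    ones(6),
    ConjugateGradient(),
    Optim.Options(x_tol = 1e-7);
    autodiff = :forward,
)

Optim.minimizer(res)  # ≈ [1, 1, 1, 1, 1, 0] for this toy objective
```

Because `f` closes over nothing mutable and is written in plain Julia, ForwardDiff (what `autodiff = :forward` uses under the hood) can differentiate straight through the splat.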
