Optim.jl: first-order (and second-order) optimization without providing an objective function

I asked a similar question here: How can I use Optim/BFGS on functions I can't evaluate but whose gradients I can compute? - #11 by marius311

In the end I settled on using a zero-finder on the gradient, specifically Broyden's method.
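
For concreteness, here's a minimal sketch of that approach using NLsolve.jl (not from the original thread; the toy gradient is mine, and the `:broyden` method keyword, which NLsolve documents as experimental, is an assumption about the current API):

```julia
# Minimize f(x) = sum((x .- 1).^2) without ever calling f,
# by finding a zero of its gradient g(x) = 2 .* (x .- 1).
using NLsolve

# In-place gradient; in the real use case this is the only thing you can compute.
g!(G, x) = (G .= 2 .* (x .- 1))

result = nlsolve(g!, zeros(3); method = :broyden)  # Broyden's quasi-Newton zero-finder
result.zero  # ≈ [1.0, 1.0, 1.0], the minimizer of f
```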

One caveat to note is that a zero-finder doesn't assume the gradient vector is a conservative vector field, even though you secretly know that it is, since it's a gradient. Concretely, the Jacobian of a gradient field is the Hessian, which is symmetric, and a general zero-finder doesn't exploit that symmetry. This generally means these algorithms won't be exactly equivalent to a hypothetical optimizer that ignores the objective. That said, this might just be an academic point; at least on my problems I didn't find a big difference in practice.
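
To spell that out (my gloss, not from the original thread): Broyden's rank-one update to the Jacobian approximation $B_k$,

$$
B_{k+1} = B_k + \frac{(y_k - B_k s_k)\, s_k^\top}{s_k^\top s_k},
\qquad s_k = x_{k+1} - x_k,\quad y_k = g(x_{k+1}) - g(x_k),
$$

does not preserve symmetry, whereas the BFGS update

$$
B_{k+1} = B_k - \frac{B_k s_k s_k^\top B_k}{s_k^\top B_k s_k} + \frac{y_k y_k^\top}{y_k^\top s_k}
$$

stays symmetric whenever $B_k$ is, matching the symmetry of the true Hessian $\nabla^2 f$.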

I do wish there were an easy way to have Optim use a fixed step size and never try to evaluate the objective. LineSearches.Static exists for the fixed step size, but I don't think Optim has an option to never use the objective, at least as of the last time I checked. It would be nice to have that.
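
For reference, a minimal sketch of the fixed-step-size half, which does exist (the toy `f`/`g!` are mine): `Static()` accepts the initial step unchanged, and `InitialStatic` pins that initial step to a constant `alpha`. You still have to hand Optim the objective, which is exactly the missing piece described above.

```julia
using Optim, LineSearches

f(x)     = sum(abs2, x .- 1)      # toy objective (still required by Optim)
g!(G, x) = (G .= 2 .* (x .- 1))   # its gradient

# Fixed step size: Static line search + constant initial step alpha.
alg = GradientDescent(
    alphaguess = LineSearches.InitialStatic(alpha = 0.1),
    linesearch = LineSearches.Static(),
)
res = optimize(f, g!, zeros(3), alg)
Optim.minimizer(res)  # ≈ [1.0, 1.0, 1.0]
```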
