I would like to use Optim.jl to perform gradient descent, and for reasons of my own I'd like to run some code after each step, possibly modifying the current iterate. Does anyone know whether this is possible, and how to do it?
Thanks in advance!
AFAICT this is not part of the exposed API, but I imagine you could call
update_state! in a loop and apply your transformation after each step.
You'd have to dig into the source code a bit, though, and these internal functions can of course change without warning between versions.
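To make the idea concrete, here is a rough sketch of driving the solver loop by hand. It leans on Optim.jl internals (`Optim.initial_state`, `Optim.update_state!`, `Optim.update_g!`), which are not part of the public API and may differ across versions, and the `clamp!` call is just a placeholder for whatever per-step transformation you have in mind:

```julia
using Optim

f(x) = sum(abs2, x)
g!(G, x) = (G .= 2 .* x)

x0 = [1.0, 2.0]
method = GradientDescent()
options = Optim.Options()

# OnceDifferentiable is public; initial_state and the update functions are internals
d = OnceDifferentiable(f, g!, x0)
state = Optim.initial_state(method, options, d, x0)

for i in 1:100
    Optim.update_state!(d, state, method)  # take one gradient-descent step
    # hypothetical post-step hook: modify the current iterate in place
    clamp!(state.x, -1.0, 1.0)
    Optim.update_g!(d, state, method)      # refresh the gradient at the new iterate
end
```

You lose the built-in convergence checks this way, so you'd also need to replicate whatever stopping criterion you care about inside the loop.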
Thanks for your reply! I'll dig into that shortly. Someone on Optim.jl's Gitter actually suggested using the retract method, which was introduced for manifold optimization; I'll see whether that lets me avoid relying on non-exposed code.
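For anyone finding this later, a sketch of that suggestion: Optim.jl's manifold interface calls `retract!` on the iterate after each step, so a custom `Manifold` subtype can serve as a post-step hook while staying within documented machinery. The `PostStep` type and the `clamp!` transformation below are made up for illustration; check the manifold docs for the exact interface in your version:

```julia
using Optim

# Hypothetical "manifold" whose only job is to transform the iterate after each step
struct PostStep <: Optim.Manifold end

# retract! maps the new iterate back onto the "manifold";
# here it applies an arbitrary in-place transformation instead
function Optim.retract!(::PostStep, x)
    clamp!(x, -1.0, 1.0)  # placeholder transformation
    return x
end

# leave gradients unchanged (no tangent-space projection needed here)
Optim.project_tangent!(::PostStep, g, x) = g

f(x) = sum(abs2, x .- 2)
res = optimize(f, [0.0, 0.0], GradientDescent(manifold = PostStep()))
```

One caveat: the line search evaluates the objective at trial points before `retract!` is applied, so this is only equivalent to a true post-step callback if your transformation doesn't invalidate those evaluations.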