I would like to use Optim.jl to perform a gradient descent, and for some reason I would like to be able to run some instructions after each step, possibly modifying the current iterate. Does anyone know if this is possible, and how to do such a thing?
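For context, the closest thing I have found is the `callback` option of `Optim.Options`, which lets me run code at every iteration; as far as I can tell, though, it only lets me observe the state, not modify the iterate. A minimal sketch of that approach (the exact object passed to the callback depends on the trace options, so take the details with a grain of salt):

```julia
using Optim

# Standard Rosenbrock test problem.
f(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2
function g!(G, x)
    G[1] = -2.0 * (1.0 - x[1]) - 400.0 * x[1] * (x[2] - x[1]^2)
    G[2] = 200.0 * (x[2] - x[1]^2)
end

# With extended_trace = true the current iterate is available in the
# state's metadata ("x" is a copy, so mutating it has no effect on the
# optimization). Returning true from the callback stops the run.
function cb(state)
    x = state.metadata["x"]
    println("iteration $(state.iteration): x = $x")
    return false
end

res = optimize(f, g!, [0.0, 0.0], GradientDescent(),
               Optim.Options(callback = cb, extended_trace = true))
```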
Thanks for your reply! I’ll dig into that shortly. Someone on Optim.jl’s Gitter actually suggested using the retract method, which was introduced for manifold optimization; I’ll see if that lets me avoid relying on non-exposed code.
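For anyone finding this later, here is a rough sketch of that retract-based idea as I understand it: Optim calls `retract!(M, x)` on each new iterate when a manifold is passed to the optimizer, so a custom `Optim.Manifold` subtype can hijack that hook to run arbitrary in-place modifications. `HookManifold` is a hypothetical name, and the clamping is just an example instruction:

```julia
using Optim
import Optim: retract!, project_tangent!

# Hypothetical "manifold" whose retraction runs a user hook after each
# step. Here the hook clamps the iterate to a box, effectively giving a
# projected gradient descent, but it could be any in-place modification.
struct HookManifold <: Optim.Manifold end

function retract!(M::HookManifold, x)
    clamp!(x, -1.0, 1.0)   # example post-step instruction
    return x
end

# Leave the gradient untouched: the "tangent space" is all of R^n here.
project_tangent!(M::HookManifold, g, x) = g

f(x) = sum(abs2, x .- 2.0)
g!(G, x) = (G .= 2.0 .* (x .- 2.0))

res = optimize(f, g!, zeros(2), GradientDescent(manifold = HookManifold()))
```

The upside of this over the callback is that the modification feeds back into the optimization; the downside is that it bends an interface meant for genuine manifold constraints, so line searches may behave oddly if the hook changes the objective value significantly.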