NLopt - Same complex calculations in objective function and nonlinear constraints

Thanks for the feedback! I had seen your post announcing Nonconvex (and the interesting subsequent discussion), but I've not taken the time to dig into using it. Around the same time, somebody else pointed me to Julia Smooth Optimizers: too many things to read!

I had not thought about moving the objective into a constraint. This indeed removes the need for caching, since NLopt supports vector constraints. However, I suspect this requires further investigation to see how each algorithm reacts. It is a standard trick in Linear or Conic Programming (e.g. to transform a QP into an SOCP, if I remember correctly), but I'm not sure that all NLP solvers would react well to this apparent absence of an objective.
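For the record, here is a minimal sketch of that epigraph reformulation with NLopt.jl: minimize an auxiliary variable `t` subject to `f(x) - t <= 0`, so the objective and constraint evaluations can share one callback. The functions `f` and `g` below are toy placeholders, not my real model:

```julia
using NLopt

# Toy placeholders standing in for the real (expensive) functions.
f(x) = (x[1] - 1)^2 + (x[2] - 2)^2   # original objective
g(x) = x[1]^2 + x[2]^2 - 4           # original constraint, g(x) <= 0

n = 2                                 # dimension of the original x
opt = Opt(:LN_COBYLA, n + 1)          # augmented variable z = [x; t]

# New objective is just the auxiliary variable t = z[end].
# COBYLA is derivative-free, so `grad` stays unused here.
min_objective!(opt, (z, grad) -> z[end])

# One vector constraint [f(x) - t, g(x)] <= 0: the expensive common
# computation shared by f and g would be done once in this callback.
inequality_constraint!(opt, (result, z, grad) -> begin
    x = @view z[1:n]
    result[1] = f(x) - z[end]
    result[2] = g(x)
end, fill(1e-8, 2))

xtol_rel!(opt, 1e-8)
x0 = [0.0, 0.0]
minf, minz, ret = optimize(opt, vcat(x0, f(x0)))  # start t at f(x0)
```

Whether a given NLP algorithm copes well with this trivially linear objective is exactly the part I'd want to test per solver.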

About Nonconvex, I was about to say that for my use case I need ForwardDiff (*), but I see deep in the docs that using ForwardDiff instead of Zygote is possible. Perhaps you could adapt your Getting started (in the item "The use of Zygote.jl for automatic differentiation (AD) of the objective and constraint functions. Specifying analytic gradients is also possible…") to reflect this. This also reminds me that I have to investigate ChainRules as well…
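In case it helps anyone landing here later: my current understanding is that a custom `ChainRulesCore.rrule` lets you reuse work between the primal evaluation and the pullback, which is related to the caching question above. A minimal sketch (`mynorm` is a made-up example, and I may be missing subtleties):

```julia
using ChainRulesCore

# Toy function whose primal value is worth reusing in the gradient.
mynorm(x) = sqrt(sum(abs2, x))

# Custom reverse rule: the pullback closes over the primal value y,
# so ChainRules-based AD (e.g. Zygote) avoids recomputing the norm.
# Assumes x != 0 so that y > 0.
function ChainRulesCore.rrule(::typeof(mynorm), x)
    y = mynorm(x)
    mynorm_pullback(ȳ) = (NoTangent(), ȳ .* x ./ y)
    return y, mynorm_pullback
end
```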

(*) or more accurately: I'm pretty happy with ForwardDiff, very unsatisfied with ReverseDiff, and I have yet to investigate more AD packages like Zygote. I have fewer than ten inputs and outputs.
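For context, this is roughly how I wire ForwardDiff into NLopt today (`rosen` is just a stand-in objective):

```julia
using NLopt, ForwardDiff

rosen(x) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2   # toy objective

# NLopt-style objective: fill `grad` in place via ForwardDiff when the
# algorithm requests a gradient (length(grad) > 0).
function nlopt_obj(x, grad)
    if length(grad) > 0
        ForwardDiff.gradient!(grad, rosen, x)
    end
    return rosen(x)
end

opt = Opt(:LD_LBFGS, 2)
min_objective!(opt, nlopt_obj)
xtol_rel!(opt, 1e-8)
minf, minx, ret = optimize(opt, [0.0, 0.0])
```

With fewer than ten inputs and outputs, forward mode is cheap enough that this has been good enough for me so far.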
