@constraint macro vs autodiff vs providing gradients

Hello,

I have been using Julia for about 3 weeks. A lot still to learn.

In my optimization problem, I need as much speed as possible (I am using a Grey Wolf Optimizer with many NLP calls). I am aware that, as the JuMP documentation mentions, it is better to provide gradients than to rely on "register(model, :my_f, 'number of arguments', my_f, autodiff=true)" when dealing with problems with many variables (which is my case) and with user-defined functions in the constraints and the objective function. At this point, I have the following options:

1- Simply use the @constraint and @NLconstraint macros to write my constraints;

2- Define a function with a single argument, "my_f(x)" (where x will actually receive an array with all my variables), write my constraint inside the function using the necessary indices of x (e.g., my_f(x) = x[1]+x[7]-2), and use "register(model, :my_f, 1, my_f, gradient_my_f, hessian_my_f); @constraint(model, my_f(x)==0)";

3- Define a function with the number of arguments equal to the number of variables in the constraint (e.g., my_f(x,y) = x+y-2), and use "register(model, :my_f, 2, my_f, gradient_my_f); @constraint(model, my_f(x[1],x[7])==0)" (a minimal sketch of this option is shown below).

(The constraint above is merely an example!)
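For concreteness, here is a minimal sketch of what I have in mind for option 3 (the dimensions, the constraint, and the helper name grad_my_f are placeholders; as far as I understand, the gradient callback fills a vector in place, and registered functions can only appear inside the @NL... macros, so I use @NLconstraint here):

```julia
using JuMP, Ipopt

model = Model(Ipopt.Optimizer)
@variable(model, x[1:10])   # placeholder dimension

# Option 3: a function of two scalar arguments plus a hand-written gradient.
my_f(a, b) = a + b - 2

# Gradient callback: fills the pre-allocated vector g in place.
function grad_my_f(g::AbstractVector, a, b)
    g[1] = 1.0
    g[2] = 1.0
    return
end

register(model, :my_f, 2, my_f, grad_my_f)

# Registered functions are only recognized inside the @NL... macros.
@NLconstraint(model, my_f(x[1], x[7]) == 0)
```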

Which one should provide the fastest optimization? I know option 2 does not work with nonlinear constraints, since these do not allow arrays as arguments to user-defined functions. What I really want to know is whether there is an advantage in defining my own functions and providing gradients and Hessians (the latter only in option 2) compared to writing my constraints with the @constraint and @NLconstraint macros.

Please assume my constraints are supported by JuMP. In case this issue relates more to the solver than to JuMP, I am using Ipopt.

Thank you for your time!

Regards.

If it is possible to write your problem in arithmetic form, then I would recommend using these macros. If an expression is linear or quadratic, you should always use @constraint instead of @NLconstraint; this speeds up derivative computation.
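As a rough illustration (a minimal sketch only, reusing the model and x from your example; the specific expressions and bounds are made up):

```julia
# Affine and quadratic expressions: use @constraint so JuMP passes their
# structure to the solver directly, without the nonlinear interface.
@constraint(model, x[1] + x[7] == 2)
@constraint(model, x[1]^2 + x[7]^2 <= 4)

# Genuinely nonlinear expressions go through @NLconstraint; JuMP computes
# their derivatives automatically with its built-in AD.
@NLconstraint(model, exp(x[1]) * x[7] <= 10)
```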

My experience has been that it is challenging to beat JuMP's out-of-the-box performance on derivative computation. I only use registered functions when an arithmetic form of the constraint is not possible.


All my constraints can be written arithmetically. In that case, I guess I cannot make it faster than it currently is. Thank you for your reply.