Using CasADi from Julia

I am trying to use CasADi from Julia through its Opti stack, via PyCall. I get an error when I run the following code:

opti = casadi.Opti();

x = opti.variable();
y = opti.variable();

opti.minimize((1-x)^2+(y-x^2)^2);

opti.solver('ipopt');
sol = opti.solve();

plot(sol.value(x),sol.value(y),'o');

However, if I replace it with the following code, the error disappears.

opti = casadi.Opti();

x = opti._variable();
y = opti._variable();

opti.minimize((1-x)^2+(y-x^2)^2);

opti.solver('ipopt');
sol = opti.solve();

plot(sol.value(x),sol.value(y),'o');

How is opti._variable() different from opti.variable()? I could not find any documentation of this underscore usage in PyCall. Can anyone point me to it?

Note 1: I learned about the underscore usage from an example in GitHub - ichatzinikolaidis/CasADi.jl: Julia interface to CasADi via PyCall.

Note 2: I am trying to learn numerical optimal control and hence want to implement the algorithms given in Practical Methods for Optimal Control and Estimation Using Nonlinear ... - John T. Betts - Google Books. It seems that second-order methods are better suited for this.

Does CasADi offer some functionality you require and can’t find in the native Julia offerings, such as JuMP?


I am using CasADi for numerical optimal control. Initially I tried JuMP. However, due to the issue mentioned here (Feature Request: Second Derivatives for User Defined Functions · Issue #1198 · jump-dev/JuMP.jl · GitHub), JuMP is not suitable for my application.


How about using some native Julia packages for numerical optimal control, such as GitHub - RoboticExplorationLab/TrajectoryOptimization.jl: A fast trajectory optimization library written in Julia or GitHub - JuliaMPC/NLOptControl.jl: nonlinear control optimization tool?


Second-order optimization using Ipopt or the augmented Lagrangian algorithm is supported in GitHub - mohamed82008/Nonconvex.jl: Toolbox for non-convex constrained optimization. I use AD for that.


You may perhaps want to share a (minimal version of the) statement of the optimal control problem here, possibly accompanied by CasADi-based Python code for solving it. It may happen that somebody here will be able to show the native Julia way to solve the problem.

I could use a package like NLOptControl.jl directly. However, I am trying to learn numerical optimal control and hence want to implement the algorithms given in Practical Methods for Optimal Control and Estimation Using Nonlinear … - John T. Betts - Google Books.

Thank you for the link. Looks good. I will try using it.

I think the closest to CasADi in Julia right now, as a DSL for control problems, would be a mix of ModelingToolkit and DiffEqFlux, the former being the “direct method” and the latter the “indirect method”, in control terminology.

When I reproduce your error with my copy of CasADi, it points to an automatically generated file; the distinction between variable and _variable comes from CasADi’s internals. I am not really sure why the sys._getframe call makes PyCall throw an error.

generated code from casadi
# This file was automatically generated by SWIG (http://www.swig.org).
# Version 3.0.11
...
    def _variable(self, *args):
        """
          Create a decision variable (symbol)

          _variable(self, int n, int m, str attribute) -> MX

        The order of creation matters. The order will be reflected in the
optimization problem. It is not required for decision variables to actually
        appear in the optimization problem.

        Parameters:
        -----------

        n:  number of rows (default 1)

        m:  number of columns (default 1)

        attribute:  'full' (default) or 'symmetric'
        """
        return _casadi.Opti__variable(self, *args)
...
    def variable(self, *args):
        import sys
        import os
        frame = sys._getframe(1)
        meta = {"stacktrace": {"file": os.path.abspath(frame.f_code.co_filename), "line": frame.f_lineno, "name": frame.f_code.co_name}}
        ret = self._variable(*args)
        self.update_user_dict(ret, meta)
        return ret
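So the generated variable is just _variable plus caller-frame bookkeeping: it uses sys._getframe(1) to record where the user created the variable. Here is a minimal pure-Python sketch of that pattern (the names variable_like and user_code are mine, not CasADi’s), plus the stdlib failure mode that a call with no Python caller frame would hit — plausibly what PyCall runs into, since Julia enters the interpreter without a Python frame above the call:

```python
import sys

def variable_like():
    # Mimic casadi's generated wrapper: record where the caller invoked us.
    frame = sys._getframe(1)  # the frame one level up the Python call stack
    return {"name": frame.f_code.co_name, "line": frame.f_lineno}

def user_code():
    return variable_like()

meta = user_code()
print(meta["name"])  # the wrapper saw its Python caller: user_code

# CPython raises ValueError ("call stack is not deep enough") when no
# frame exists at the requested depth:
try:
    sys._getframe(10**6)
except ValueError as e:
    print(type(e).__name__)  # ValueError
```

This is consistent with _variable working from PyCall while variable fails: the underscore version never inspects the caller’s frame.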

Just to clarify, JuMP will use second derivative information if you write out the problem algebraically.
It’s only if you write a user-defined function that we disable hessians.

Your first example in JuMP is:

using JuMP, Ipopt
model = Model(Ipopt.Optimizer)
@variable(model, x)
@variable(model, y)
@NLobjective(model, Min, (1 - x)^2 + (y - x^2)^2)
optimize!(model)

I had initially tried JuMP, in particular the example given here. However, if I want to automate the procedure (the user enters the dynamics and the constraints on controls and states, and the program transcribes the problem into a format suitable for JuMP), Hessians of user-defined functions would be required. There could be a workaround using macros, but I am not proficient in Julia.

The Python version of the code uses the name “variable()”. However, I am not able to use the name “variable()” through PyCall.

using PyCall

cd = pyimport("casadi")

opti = cd.Opti()

x = py"$(opti).variable()"
y = opti.variable()

In this code, x is defined without error, but defining y throws an error.