I am trying to use CasADi's Opti stack from Julia via PyCall. I get an error when I use the following code:
opti = casadi.Opti();
x = opti.variable();
y = opti.variable();
opti.minimize((1-x)^2+(y-x^2)^2);
opti.solver('ipopt');
sol = opti.solve();
plot(sol.value(x),sol.value(y),'o');
However, if I replace it with the following code, the error disappears.
opti = casadi.Opti();
x = opti._variable();
y = opti._variable();
opti.minimize((1-x)^2+(y-x^2)^2);
opti.solver('ipopt');
sol = opti.solve();
plot(sol.value(x),sol.value(y),'o');
How is opti._variable() different from opti.variable()? I could not find any documentation of the underscore usage in PyCall. Can anyone point me to it?
I am using CasADi for numerical optimal control. Initially I tried to use JuMP; however, due to the issue mentioned here (https://github.com/jump-dev/JuMP.jl/issues/1198), JuMP is not suitable for my application.
You may want to share a (minimal version of the) optimal control problem statement here, possibly accompanied by CasADi-based Python code that solves it. Somebody here may be able to show a native Julia way to solve the problem.
I think the closest to CasADi in Julia right now, as a DSL for control problems, would be a mix of ModelingToolkit and DiffEqFlux, the former being the "direct method" and the latter the "indirect method", in control terminology.
When I reproduce your error with my copy of CasADi, it points to an automatically generated file; the distinction between variable and _variable comes from CasADi's internals. I am not really sure why the _getframe call makes PyCall throw an error.
generated code from casadi
# This file was automatically generated by SWIG (http://www.swig.org).
# Version 3.0.11
...
def _variable(self, *args):
    """
    Create a decision variable (symbol)

    _variable(self, int n, int m, str attribute) -> MX

    The order of creation matters. The order will be reflected in the
    optimization problem. It is not required for decision variables to actually
    appear in the optimization problem.

    Parameters:
    -----------
    n: number of rows (default 1)
    m: number of columns (default 1)
    attribute: 'full' (default) or 'symmetric'
    """
    return _casadi.Opti__variable(self, *args)
...
def variable(self, *args):
    import sys
    import os
    frame = sys._getframe(1)
    meta = {"stacktrace": {"file": os.path.abspath(frame.f_code.co_filename), "line": frame.f_lineno, "name": frame.f_code.co_name}}
    ret = self._variable(*args)
    self.update_user_dict(ret, meta)
    return ret
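So variable() is a thin wrapper around _variable() that additionally inspects the caller's Python stack frame via sys._getframe(1) to record where each variable was created. When the call arrives from Julia through PyCall, there may be no Python caller frame on the stack, so the frame lookup can fail; _variable() skips the frame inspection entirely, which would explain why it works. Here is a minimal pure-Python sketch of the same pattern (the names _variable, variable, and user_code are stand-ins for illustration, not CasADi's actual API):

```python
import os
import sys

def _variable():
    # Stand-in for the raw SWIG binding (_casadi.Opti__variable).
    return "MX symbol"

def variable():
    # Mimics CasADi's generated wrapper: look one frame up the Python
    # call stack to record where the caller created the variable.
    frame = sys._getframe(1)
    meta = {"stacktrace": {"file": os.path.abspath(frame.f_code.co_filename),
                           "line": frame.f_lineno,
                           "name": frame.f_code.co_name}}
    return _variable(), meta

def user_code():
    return variable()

val, meta = user_code()
print(meta["stacktrace"]["name"])  # user_code

# sys._getframe raises ValueError when asked for a frame deeper than
# the current Python call stack -- the suspected failure mode when the
# wrapper is invoked without a Python caller frame.
try:
    sys._getframe(10**6)
except ValueError as exc:
    print(type(exc).__name__)  # ValueError
```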
Just to clarify, JuMP will use second-derivative information if you write out the problem algebraically.
It's only if you write a user-defined function that we disable Hessians.
Your first example in JuMP is:
using JuMP, Ipopt
model = Model(Ipopt.Optimizer)
@variable(model, x)
@variable(model, y)
@NLobjective(model, Min, (1 - x)^2 + (y - x^2)^2)
optimize!(model)
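As a sanity check for either formulation: the objective (1 - x)^2 + (y - x^2)^2 is a Rosenbrock-type function whose global minimum is 0 at (x, y) = (1, 1), so both the CasADi and the JuMP versions should report a solution near that point. A plain-Python check of the objective itself:

```python
def objective(x, y):
    # The objective used throughout this thread.
    return (1 - x) ** 2 + (y - x ** 2) ** 2

print(objective(1.0, 1.0))  # 0.0 -- the global minimum
print(objective(0.0, 0.0))  # 1.0
```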
I had initially tried JuMP, in particular the example given here. However, if I want to automate the procedure (the user enters the dynamics and the constraints on controls and states, and the program transcribes the problem into a format suitable for JuMP), Hessians of user-defined functions would be required. There could be a workaround using macros, but I am not proficient in Julia.