Hello to anyone reading this, and sorry if there are some formatting issues. First post and all. Promise I'll learn. Anyway, a lot of the work I do involves optimal control for spacecraft using indirect methods, so I need to solve differential equations fast. This is obviously the major draw of Julia for me. My experience with Julia in this department has been pretty good so far; I have learned a lot from SciMLTutorials.jl and from solving several simple problems on my own. Recently, I wrote the following code:
```julia
using LinearAlgebra
using Symbolics

function generate_eoms()
    @variables p f g h k L m            # state variables
    @variables p1 p2 p3 p4 p5 p6 p7     # costates
    @variables ur ut un delta           # controls
    @variables c Thr rho SI2CAN t       # extra parameters
    pd = c, Thr, rho, SI2CAN

    # Form state vectors
    S = [p; f; g; h; k; L; m]
    P = [p1; p2; p3; p4; p5; p6; p7]
    X = [S; P]

    # Get A matrix
    SinL = sin(L)
    CosL = cos(L)
    q  = 1 + f*CosL + g*SinL
    s  = 1 + h^2 + k^2
    C1 = sqrt(p)
    C2 = 1/q
    C3 = h*SinL - k*CosL
    A = [ 0.0       2*p*C2*C1                 0.0;
          C1*SinL   C1*C2*((q+1)*CosL + f)   -C1*(g/q)*C3;
         -C1*CosL   C1*C2*((q+1)*SinL + g)    C1*(f/q)*C3;
          0.0       0.0                       C1*s*CosL*C2/2;
          0.0       0.0                       C1*s*SinL*C2/2;
          0.0       0.0                       C1*C2*C3;
          0.0       0.0                       0.0]

    # Get vector optimal control
    Pv = transpose(A)*P
    normPv = sqrt(Pv[1]*Pv[1] + Pv[2]*Pv[2] + Pv[3]*Pv[3])
    U_op = -Pv/normPv

    # Get scalar optimal control
    SF = c*SI2CAN*normPv/m + p7
    delta_op = 0.5*(1 + tanh(SF/rho))
    Delta = Thr/m*SI2CAN*delta_op*U_op

    # Get acceleration terms
    b = [0.0; 0.0; 0.0; 0.0; 0.0; sqrt(p)*(q/p)^2; -Thr/c*delta_op]
    dy = A*Delta + b

    # Define cost
    Lagrangian = 0.0

    # Hamiltonian is a scalar
    Hamiltonian = Lagrangian + dot(P, dy)

    # The costate dynamics are the negative gradient of the Hamiltonian
    # with respect to the state
    costates = -Symbolics.gradient(Hamiltonian, S)

    # Full nonlinear system of equations; build the in-place generated function
    F = [dy; costates]
    return eval(build_function(F, X, pd, t)[2])
end
```
Since the research I do involves solving different problems for lots of complex dynamical systems, it's nice to be able to derive the governing equations symbolically: half of them (the costate equations) are nearly impossible to derive by hand, and they are obtained the same way as the time rates of change of the momenta in Hamiltonian mechanics.
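For reference, with state vector S, costate vector P, and Hamiltonian H(S, P), the relations I'm relying on are the canonical equations

$$\dot{S} = \frac{\partial H}{\partial P}, \qquad \dot{P} = -\frac{\partial H}{\partial S},$$

which is why the code only needs the Hamiltonian and a gradient to produce the costate dynamics.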
The goal of the function I wrote is to use the physical dynamics to derive the costate dynamics and then generate an optimized function that evaluates both and can be integrated with DifferentialEquations.jl. However, the code shown above produces a function that takes an incredibly long time to evaluate even once (on the order of minutes), let alone integrate, so I am obviously doing something wrong here. The strange thing is that I have used this exact process in Julia for different dynamical systems and it has worked very well! But not this time.
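For context, here is roughly how I intend to hook the generated function into DifferentialEquations.jl. The parameter values and initial conditions below are placeholders just to show the call pattern, not my actual problem data:

```julia
using DifferentialEquations

eom! = generate_eoms()   # in-place function with signature eom!(dX, X, pd, t)

# Placeholder numbers, not real problem data
pd = (1.0, 0.1, 1.0, 1.0)                                     # (c, Thr, rho, SI2CAN)
X0 = vcat([1.5, 0.1, 0.1, 0.0, 0.0, 0.0, 1.0], 0.1*ones(7))   # [state; costates]

prob = ODEProblem(eom!, X0, (0.0, 10.0), pd)
sol  = solve(prob, Vern7())
```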
I have attempted to take a different approach using the ForwardDiff.jl package like so:
```julia
using ForwardDiff

# State rates: dX = ∂H/∂P; costate rates: dP = -∂H/∂X.
# The Hamiltonian is a scalar, so these are gradients.
xDot!(dX, ham, X, P, probdata, t) = ForwardDiff.gradient!(dX, P ->  hamiltonian(ham, X, P, probdata), P)
pDot!(dP, ham, X, P, probdata, t) = ForwardDiff.gradient!(dP, X -> -hamiltonian(ham, X, P, probdata), X)
```
The hamiltonian function here is nearly identical to generate_eoms(), except that it just computes the value of the Hamiltonian. This approach works better (and is a lot nicer to look at), but the resulting functions are still pretty slow. The only way I have found to speed up evaluation of the symbolically generated function is to skip some of the substitutions, write the generated code to its own file, and then manually add back the lines I skipped. That is obviously not ideal, but it is very fast; otherwise, the code produced by build_function() is extremely lengthy.
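For completeness, this is a sketch of how I wrap those two calls into a single in-place ODE function for DifferentialEquations.jl. The 7/7 split matches the state/costate dimensions above, and `ham` and `probdata` are the same objects passed to `hamiltonian`; the wrapper name `eoms!` is just illustrative:

```julia
# Combined dynamics for the augmented state Y = [S; P]
function eoms!(dY, Y, prm, t)
    ham, probdata = prm
    X = @view Y[1:7]      # state
    P = @view Y[8:14]     # costates
    xDot!(view(dY, 1:7),  ham, X, P, probdata, t)   # dS/dt =  ∂H/∂P
    pDot!(view(dY, 8:14), ham, X, P, probdata, t)   # dP/dt = -∂H/∂S
    return nothing
end
```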
My main questions: What is the best way to programmatically generate the differential equations of a Hamiltonian system in an efficient form and then integrate them? Where did I go wrong here?
Please leave any and all criticisms of my code/questions/person. Please leave them as soon as you have a chance; my advisor is breathing down my neck.
Thanks! - Sam