Julia's DifferentialEquations package fails when using a Fortran-wrapped right-hand side

I’m trying to solve a system of ordinary differential equations with Julia’s DifferentialEquations.jl package. The right-hand side of my ODEs is written in Fortran 90 and wrapped with ccall. Here is my Julia code:

using DifferentialEquations

# Right-hand side: delegates to the Fortran subroutine rhs in module atmos
function rhs(dNdt, N, p, t)
    ccall((:__atmos_MOD_rhs, "./EvolveAtmFort.so"), Cvoid,
          (Ref{Float64}, Ref{Float64}, Ref{Float64}), t, N, dNdt)
end

N0 = [0.0,298.9,0.0562,22.9,0.0166,35.96,0.0,0.0,0.0,0.0]*6.022e23
tspan = [0.0,1.0e6*365.0*24.0*60.0*60.0]
prob = ODEProblem(rhs,N0,tspan)
sol = solve(prob,Rodas5());

This produces the following long error, which has to do with calculating the derivative/Jacobian of the right-hand side. Below, I include only the portions of the stacktrace that seem important.

MethodError: no method matching Float64(::ForwardDiff.Dual{ForwardDiff.Tag{SciMLBase.TimeGradientWrapper{ODEFunction{true,typeof(rhs),LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Array{Float64,1},SciMLBase.NullParameters},Float64},Float64,1})
Closest candidates are:
  Float64(::Real, ::RoundingMode) where T<:AbstractFloat at rounding.jl:200
  Float64(::T) where T<:Number at boot.jl:716
  Float64(::Float32) at float.jl:255
  ...

Stacktrace:
[1] convert(::Type{Float64}, ::ForwardDiff...
[2] Base.RefValue{Float64}(::ForwardDiff.Dual{...
[3] convert(::Type{Ref{Float64}}, ::ForwardDiff....
[4] cconvert(::Type{T} where T,...
[5] rhs(::Array{ForwardDiff.Dual{ForwardDiff.Tag{SciMLBase.TimeGradientWrapper...
[6] (::ODEFunction{true,typeof(rhs),LinearAlgebra...
[7] (::SciMLBase.TimeGradientWrapper{ODEFunction{true,typeof(rhs),LinearAlgebra.UniformScaling{Bool}...
[8] derivative!(::Array{Float64,1}, ::SciMLBase.TimeGradientWrapper...
[9] calc_tderivative!(::OrdinaryDiffEq.ODEIntegrator{Rodas5...
[10] calc_rosenbrock_differentiation!...
[etc...]

When I use a Jacobian-free method, like Tsit5(), the integration works just fine. Only methods that require Jacobian calculations fail. What am I doing wrong? How can I adjust my Fortran wrapper so that I can use implicit methods? Thanks! I’m using Julia 1.5.3 and DifferentialEquations.

You’re not doing anything wrong, it’s just that the automatic Jacobian is provided by ForwardDiff.jl, which needs Julia code. It works by calling the rhs with a special Dual type (visible in your error message) that facilitates automatic differentiation. Unfortunately, the same doesn’t work with Fortran code.
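A minimal illustration of what’s happening (assuming ForwardDiff.jl is installed):

```julia
using ForwardDiff

# Pure Julia code happily accepts Dual numbers, so the derivative
# propagates through automatically:
f(x) = 3x^2 + 2x
ForwardDiff.derivative(f, 1.0)   # 8.0

# But a Dual cannot be converted to a plain Float64, which is exactly
# what ccall's Ref{Float64} argument conversion tries to do:
d = ForwardDiff.Dual(1.0, 1.0)
Float64(d)   # MethodError, the same error as in your stacktrace
```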

It may be sufficient to pass autodiff=false, which instructs the solver to approximate the Jacobian with numerical finite differences, computed by calling your right-hand side.

But for better accuracy, you might supply an explicit Jacobian, say in Fortran or Julia. Or if you can provide the right-hand side in Julia, it could be automatically differentiated. If the Fortran code is just math, often the translation is quite straightforward, e.g. replace ** with ^.
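For the explicit-Jacobian route, here is a sketch that reuses your rhs, N0, and tspan. The Fortran symbol __atmos_MOD_jac and its calling convention are assumptions — adapt them to whatever your module actually exports:

```julia
using DifferentialEquations

# Hypothetical wrapper around a Fortran routine that fills the Jacobian
# matrix J = d(dNdt)/dN; the symbol name is an assumption.
function jac!(J, N, p, t)
    ccall((:__atmos_MOD_jac, "./EvolveAtmFort.so"), Cvoid,
          (Ref{Float64}, Ref{Float64}, Ref{Float64}), t, N, J)
end

f = ODEFunction(rhs, jac = jac!)
prob = ODEProblem(f, N0, tspan)
sol = solve(prob, Rodas5())
```

With jac supplied, the solver never needs to differentiate through the ccall at all.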

9 Likes

Thanks! That makes sense. I have found that I can use implicit methods which are not written in Julia. For example, CVODE_BDF() or lsoda() work great. I’m guessing this is because no special type conversions are required (CVODE is written in C with ordinary doubles).

EDIT:
Also, using sol = solve(prob,Rodas5(autodiff=false)); works as well.
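For reference, those solvers come from separate wrapper packages; a sketch assuming Sundials.jl and LSODA.jl are installed, reusing prob from above:

```julia
using Sundials   # wraps the SUNDIALS C library
using LSODA      # wraps the classic Fortran LSODA

sol = solve(prob, CVODE_BDF())   # BDF with internal finite-difference Jacobian
sol = solve(prob, lsoda())       # automatic stiff/non-stiff switching
```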

I am blown away. I now understand. autodiff=true computes the analytical Jacobian automatically?!? This is insane!

Yes, those methods don’t make use of automatic differentiation.

You should indeed be blown away. In the past I’ve had to use my own clunky finite difference, or used Mathematica to get a symbolic explicit Jacobian, inserted it as C code into a Matlab mex-file, then called from Matlab. This is commonly referred to as the THREE language problem 🙂 .

Now you can just supply the rhs as native Julia, and let the ODE solver handle the Jacobian for you. You might also use ModelingToolkit.jl to derive the equations in the first place, and never touch the code for rhs or its Jacobian!
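A toy sketch of that workflow (the one-species decay system here is purely illustrative, and the exact ModelingToolkit.jl API varies between versions):

```julia
using ModelingToolkit, DifferentialEquations

# Illustrative toy model: dN/dt = -k*N
@parameters k
@variables t N(t)
D = Differential(t)

@named sys = ODESystem([D(N) ~ -k * N], t)

# jac = true asks ModelingToolkit to generate a symbolic Jacobian,
# so the stiff solver never needs autodiff or finite differences.
prob = ODEProblem(sys, [N => 1.0], (0.0, 1.0), [k => 0.5], jac = true)
sol = solve(prob, Rodas5())
```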

2 Likes

If you want to learn more about how this works, you can check out the package that implements this Dual type: GitHub - JuliaDiff/ForwardDiff.jl: Forward Mode Automatic Differentiation for Julia

I also wrote an example of how to create your own automatic differentiation tool in Julia here: DynamicWalking2018.jl/3. The Power of Multiple Dispatch.ipynb at master · rdeits/DynamicWalking2018.jl · GitHub . You wouldn’t actually want to do this in real code (you would just use ForwardDiff.jl), but it’s fun to explore and see how this sort of thing can be done particularly easily and efficiently in Julia.
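The core idea from that notebook fits in a few lines: a toy dual number plus chain-rule overloads via multiple dispatch (a sketch for intuition, not a replacement for ForwardDiff.jl):

```julia
# A number that carries its value and derivative together.
struct Dual <: Number
    val::Float64
    der::Float64
end

# Chain-rule arithmetic, hooked in with multiple dispatch:
Base.:+(a::Dual, b::Dual) = Dual(a.val + b.val, a.der + b.der)
Base.:*(a::Dual, b::Dual) = Dual(a.val * b.val, a.der * b.val + a.val * b.der)
Base.:*(a::Real, b::Dual) = Dual(a * b.val, a * b.der)

# Seed der = 1 to differentiate with respect to x:
derivative(f, x) = f(Dual(x, 1.0)).der

derivative(x -> 3x * x + 2x, 1.0)   # 8.0
```

Any unmodified Julia function built from these operations now differentiates itself, which is exactly why ForwardDiff works on your rhs once it is written in Julia.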

3 Likes

And it can make it sparse, automatically multithread it, remove redundant equations to transform your equations to a smaller set, etc. You know the drill.

3 Likes

> You should indeed be blown away. In the past I’ve had to use my own clunky finite difference, or used Mathematica to get a symbolic explicit Jacobian, inserted it as C code into a Matlab mex-file, then called from Matlab. This is commonly referred to as the THREE language problem 🙂 .

I’ve played the 3 language game before! Mathematica → Fortran/python+numba → python

I just told my research group about Julia’s auto differentiation capabilities. We study planetary atmospheres. All of them were blown away as well.

I’m starting to realize something that I’m guessing most Julia users already realize: Julia isn’t just “another hit programming language”. It’s really the best tool right now for scientific computing. Numba + Python is great, but it’s much less powerful than Julia.

4 Likes