Using ForwardDiff to differentiate an ODE with respect to some of its parameters

So far, I have this:


```julia
using DifferentialEquations
using LabelledArrays

const n = 10

struct params
    p_to_optimize_array::AbstractArray{<:Real,1}
    p_constant_array::AbstractArray{<:Real,1}
end

function f1(du, u, p, t)
    du.x .= -u.x .* p.p_constant_array.d
    du.y .= -u.y .* p.p_to_optimize_array.a .* p.p_constant_array.e
end

function objective(p)
    p_to_optimize_array = @LArray p (:a, :b, :c)
    u_0 = @LArray eltype(p) (2*n) (x = 1:n, y = (n+1):(2*n))
    u_0 .= 1000.0
    p_constant_array = @LArray [1.0, 2.0] (:d, :e)
    param_struct = params(p_to_optimize_array, p_constant_array)
    prob1 = ODEProblem(f1, u_0, (0.0, 100.0), param_struct)
    sol = sum(solve(prob1, Rodas5()).u)
end

using ForwardDiff

ForwardDiff.gradient(objective, [1.0, 2.0, 3.0])
```

The error I get is `ForwardDiff.DualMismatchError`, which I figure is because `u_0` and `p` are both `Dual`. However, if I don’t set the element type of `u_0` to `eltype(p)`, but rather `Float64` or something, then the ODE solver won’t accept the dual numbers as its state. I know DiffEqParamEstim.jl implements this sort of thing, but I want to understand how to do it myself.
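To spell out why I’m using `eltype(p)` at all: `ForwardDiff.gradient` calls the objective with `Dual` numbers, so any array allocated inside has to be able to hold them. A minimal sketch of that mechanism, with no ODE involved (`toy` is just an illustrative function, not part of my actual problem):

```julia
using ForwardDiff

function toy(p)
    u = zeros(eltype(p), 3)  # Dual-typed when called via ForwardDiff.gradient
    u .= p[1]
    return sum(abs2, u)
end

ForwardDiff.gradient(toy, [2.0, 3.0])  # zeros(Float64, 3) above would error instead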

Does `sol = solve(prob1, Rodas5(autodiff=false))` work?

No, that gives a different error: `MethodError: no method matching extract_gradient!...`. Using a non-stiff solver like `Tsit5()` gives the same error.

Your output isn’t a scalar, but you’re asking for the gradient. Did you mean to end with `sum(sol)` or something like that?
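That is, something along these lines, so `objective` returns a single number (a sketch; the exact loss is up to you):

```julia
sol = sum(solve(prob1, Rodas5()).u)  # elementwise sum over saved time points -> still a vector
return sum(sol)                      # reduce to a scalar loss
```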

That’s embarrassing, yeah, I have a loss function there in the larger project that this example is prototyping.

Yeah, the error goes away with `autodiff = false` or a non-stiff solver.
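For anyone finding this later, here’s a sketch of the version that now runs for me (same toy problem as above; the only changes from the original are the scalar return and `autodiff=false`):

```julia
function objective(p)
    p_to_optimize_array = @LArray p (:a, :b, :c)
    # state must share p's element type so ForwardDiff's Duals can propagate
    u_0 = @LArray eltype(p) (2*n) (x = 1:n, y = (n+1):(2*n))
    u_0 .= 1000.0
    p_constant_array = @LArray [1.0, 2.0] (:d, :e)
    param_struct = params(p_to_optimize_array, p_constant_array)
    prob1 = ODEProblem(f1, u_0, (0.0, 100.0), param_struct)
    # autodiff=false makes Rodas5 use finite differences for its internal
    # Jacobian, so it no longer clashes with the outer gradient's Duals;
    # a non-stiff solver like Tsit5() sidesteps the issue entirely
    sol = solve(prob1, Rodas5(autodiff=false))
    return sum(sum(sol.u))  # scalar loss
end

ForwardDiff.gradient(objective, [1.0, 2.0, 3.0])
```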

Yes, nesting autodiff here gets a little tricky (by default the stiff solver computes its internal Jacobian with ForwardDiff, which would run inside your outer `ForwardDiff.gradient` call), but that gets you working for now.
