Integro-differential equations with DifferentialEquations.jl

The integro-differential equations I need to solve are of this form: http://mathb.in/30204. While I have already written a solver for a simple test problem (see the code below), the type of equation shown in the link above can become very costly because of the matrix multiplications. In short, it would be great to have some input on how to implement this efficiently with DifferentialEquations.jl.

When building up the ‘memory integral’ on the right-hand side, it would be most efficient to save the value of the integral for the previous time steps, then perform one matrix multiplication for the next time step and add that to the existing sum over the past. While it is kind of obvious how to code this from scratch, I cannot see how to achieve a similar thing either as an ODEProblem or a DDEProblem.
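Schematically, the running-sum idea looks like this. The sketch below uses the simple case of a constant kernel (as in the test problem linked next), where the memory integral is just ∫₀ᵗ u(s) ds, so each new time step adds exactly one term to the stored sum; the names `u_hist`, `delta_t`, and `partials` are illustrative, not from my actual code.

```julia
# Running-sum update of a memory integral with a constant kernel:
# I(t_i) = I(t_{i-1}) + delta_t * u(t_{i-1}), one cheap update per step
# instead of re-summing the whole history every time.
n = 1000
delta_t = 5.0/n
u_hist = [sin(k*delta_t) for k in 0:n-1]   # stand-in for stored solution values
integral = 0.0
partials = zeros(n)
for i in 1:n
    global integral += delta_t*u_hist[i]   # one multiply-add per step
    partials[i] = integral
end
```

For a genuine convolution kernel K(t - s) this only works if the kernel factorizes (see the remark about exponential kernels further down); otherwise every past term changes when t advances.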

Test problem: https://en.wikipedia.org/wiki/Integro-differential_equation

using DifferentialEquations

const alpha = 2; const beta = 5
const u_0 = 0.0
const t_0 = 0.0
const T = 5.0
const n = 1000
const delta_t = (T - t_0)/n

const lags = range(delta_t, length=n, stop=(T-t_0)*(1-1/n))

# left Riemann sum
function f(du, u, h, p, t)
    du[1] = 1 - alpha*u[1] - beta*((T-t_0)/n)*sum(h(p, t - tau)[1] for tau in lags)
end

h(p, t) = zeros(1)
tspan = (0.0, T)
u0 = [u_0]
prob = DDEProblem(f, u0, h, tspan)
algo = MethodOfSteps(Tsit5())
sol = solve(prob, algo)

Okay, this approach via an ODEProblem should do the trick:

# including the summed-up memory integral as a parameter
function f(du, u, p, t)
    du[1] = 1 - alpha*u[1] - p[1]
end

data = zeros(Float64, n+1)
data[1] = u_0
history = 0.0
for i in 2:n+1
    global history += beta*delta_t*data[i-1]
    prob = ODEProblem(f, [data[i-1]], (0.0, delta_t), [history])
    integrator = init(prob, Tsit5(); dt=delta_t, adaptive=false)
    step!(integrator)
    data[i] = integrator.u[1]
end

It really depends on the matrix \Sigma_{ij}. If it is of the form \Sigma(t) = \sum_k \alpha_k t^k e^{-\lambda_k t}, then you can recast your problem as an ODE.
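To illustrate the recast for the simplest member of that family, an exponential kernel K(t) = e^{-λt}: the memory term y(t) = ∫₀ᵗ e^{-λ(t-s)} u(s) ds satisfies the local ODE y' = u - λy, so no history storage is needed at all. The sketch below checks this with a forward-Euler step and an illustrative u(t) = sin(t); the variable names are mine, not from the thread.

```julia
# Exponential-kernel memory term as a local ODE:
# y(t) = ∫₀ᵗ exp(-λ(t-s)) u(s) ds  obeys  y' = u - λ*y.
λ = 2.0
dt = 1e-4
T = 1.0
y = 0.0
t = 0.0
while t < T
    global y += dt*(sin(t) - λ*y)   # forward Euler on y' = u - λ*y
    global t += dt
end
# direct left-Riemann quadrature of the same convolution integral
direct = sum(exp(-λ*(T - s))*sin(s)*dt for s in 0:dt:T-dt)
```

Both first-order approximations agree to O(dt), which is the point: the ODE form replaces an O(n) sum per step with a single extra state variable. Kernels of the form t^k e^{-λt} work the same way, each adding one auxiliary variable per power of t.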


This will only update the history at the intermediate steps, but it’s still a decent approximation. Note that you don’t need to recreate the integrator every step: instead just update integrator.p and step! again. That would be more efficient.
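A minimal sketch of that reuse pattern, assuming DifferentialEquations.jl and mirroring the variable names from the loop above (the overall scheme is the same left-Riemann accumulation, just with one integrator created once):

```julia
# One integrator for the whole run: mutate the memory parameter p[1]
# in place and advance with step!, instead of calling init every step.
using DifferentialEquations

alpha, beta = 2, 5
t_0, T, n = 0.0, 5.0, 1000
delta_t = (T - t_0)/n

f(du, u, p, t) = (du[1] = 1 - alpha*u[1] - p[1])

data = zeros(n + 1)
history = 0.0
prob = ODEProblem(f, [data[1]], (t_0, T), [history])
integrator = init(prob, Tsit5(); dt=delta_t, adaptive=false)
for i in 2:n+1
    global history += beta*delta_t*data[i-1]
    integrator.p[1] = history   # update the accumulated memory integral
    step!(integrator)           # advance one fixed step of size delta_t
    data[i] = integrator.u[1]
end
```

This avoids re-allocating the problem and integrator caches n times; only the parameter vector is touched between steps.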


For the trivial example here, it’s true that I could create the integrator outside the loop! For the full problem I am working on, however, I’d have to keep as many integrators in memory as there are columns in my two-time matrices, so I still believe it is better to simply have one and recreate it at each step?

Maybe briefly regarding two-time matrices: when solving these Kadanoff-Baym-like equations, one has two time directions, one vertical and one horizontal. I take steps in the vertical direction to calculate the lower triangle (including the diagonal) and use certain symmetry relations to obtain the upper triangle.
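The fill pattern I mean can be sketched like this. The specific symmetry relation below, G[i,j] = -conj(G[j,i]), is only a placeholder to make the example run; the actual relations depend on which Kadanoff-Baym component is being propagated.

```julia
# Two-time stepping pattern: build the lower triangle (incl. diagonal)
# column by column, then fill the upper triangle from a symmetry relation.
n = 4
G = zeros(ComplexF64, n, n)
for j in 1:n            # "vertical" steps down each column
    for i in j:n
        G[i, j] = complex(i, j)   # stand-in for an actual propagation step
    end
end
for j in 2:n            # upper triangle from the (placeholder) symmetry
    for i in 1:j-1
        G[i, j] = -conj(G[j, i])
    end
end
```

The point is only that each column needs its own vertical sweep, which is why a per-column integrator would multiply the memory footprint.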