How do you add time delays to external model inputs using ModelingToolkit and DDEProblem? In the example below, adding a time delay to the input `Q` gives the error `KeyError: key Q not found`:
```julia
using ModelingToolkit, DelayDiffEq
using ModelingToolkit: t_nounits as t, D_nounits as D

tau = 1

# External input: define the numeric function first, then register it
# so it can appear symbolically in the equations.
u_Q(t) = (t < 5) ? 1 : 10
@register_symbolic u_Q(t)

@parameters p0=0.2 p1=0.2 q0=0.3 q1=0.3 v0=1 v1=1 d0=5 d1=1 d2=1 beta0=1 beta1=1
@variables x₀(t) x₁(t) x₂(..) Q(..)

eqs = [D(x₀) ~ (v0 / (1 + beta0 * (x₂(t - tau)^2))) * (p0 - q0) * x₀ - d0 * x₀ + Q(t - tau)
       D(x₁) ~ (v0 / (1 + beta0 * (x₂(t - tau)^2))) * (1 - p0 + q0) * x₀ +
               (v1 / (1 + beta1 * (x₂(t - tau)^2))) * (p1 - q1) * x₁ - d1 * x₁
       D(x₂(t)) ~ (v1 / (1 + beta1 * (x₂(t - tau)^2))) * (1 - p1 + q1) * x₁ - d2 * x₂(t)
       Q(t - tau) ~ u_Q(t)]

@mtkbuild sys = System(eqs, t)  # errors here: KeyError: key Q not found

tspan = (0.0, 10.0)
prob = DDEProblem(sys,
    [x₀ => 1.0, x₁ => 1.0, x₂(t) => 1.0],
    tspan,
    constant_lags = [tau])
alg = MethodOfSteps(Tsit5())
sol = solve(prob, alg)
```
However, the same model works fine when the time delay is removed from the external input `Q`:
```julia
using ModelingToolkit, DelayDiffEq
using ModelingToolkit: t_nounits as t, D_nounits as D

tau = 1

u_Q(t) = (t < 5) ? 1 : 10
@register_symbolic u_Q(t)

@parameters p0=0.2 p1=0.2 q0=0.3 q1=0.3 v0=1 v1=1 d0=5 d1=1 d2=1 beta0=1 beta1=1
@variables x₀(t) x₁(t) x₂(..) Q(t)

eqs = [D(x₀) ~ (v0 / (1 + beta0 * (x₂(t - tau)^2))) * (p0 - q0) * x₀ - d0 * x₀ + Q
       D(x₁) ~ (v0 / (1 + beta0 * (x₂(t - tau)^2))) * (1 - p0 + q0) * x₀ +
               (v1 / (1 + beta1 * (x₂(t - tau)^2))) * (p1 - q1) * x₁ - d1 * x₁
       D(x₂(t)) ~ (v1 / (1 + beta1 * (x₂(t - tau)^2))) * (1 - p1 + q1) * x₁ - d2 * x₂(t)
       Q ~ u_Q(t)]

@mtkbuild sys = System(eqs, t)

tspan = (0.0, 10.0)
prob = DDEProblem(sys,
    [x₀ => 1.0, x₁ => 1.0, x₂(t) => 1.0],
    tspan,
    constant_lags = [tau])
alg = MethodOfSteps(Tsit5())
sol = solve(prob, alg)
```
Does anyone have a suggestion on how to handle time delays in model inputs?
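For reference, one workaround I am experimenting with (an untested sketch, so it may not be the idiomatic ModelingToolkit way) is to drop the observed variable `Q` entirely and call the registered input function with a delayed argument directly in the right-hand side:

```julia
using ModelingToolkit, DelayDiffEq
using ModelingToolkit: t_nounits as t, D_nounits as D

tau = 1

u_Q(t) = (t < 5) ? 1 : 10
@register_symbolic u_Q(t)

@parameters p0=0.2 p1=0.2 q0=0.3 q1=0.3 v0=1 v1=1 d0=5 d1=1 d2=1 beta0=1 beta1=1
@variables x₀(t) x₁(t) x₂(..)

# The delayed input enters the RHS directly as u_Q(t - tau);
# no separate equation (or symbolic variable) for Q is needed.
eqs = [D(x₀) ~ (v0 / (1 + beta0 * (x₂(t - tau)^2))) * (p0 - q0) * x₀ - d0 * x₀ + u_Q(t - tau)
       D(x₁) ~ (v0 / (1 + beta0 * (x₂(t - tau)^2))) * (1 - p0 + q0) * x₀ +
               (v1 / (1 + beta1 * (x₂(t - tau)^2))) * (p1 - q1) * x₁ - d1 * x₁
       D(x₂(t)) ~ (v1 / (1 + beta1 * (x₂(t - tau)^2))) * (1 - p1 + q1) * x₁ - d2 * x₂(t)]

@mtkbuild sys = System(eqs, t)

prob = DDEProblem(sys,
    [x₀ => 1.0, x₁ => 1.0, x₂(t) => 1.0],
    (0.0, 10.0),
    constant_lags = [tau])
sol = solve(prob, MethodOfSteps(Tsit5()))
```

This sidesteps the question of how `@mtkbuild` handles an equation whose left-hand side is a delayed variable, but I would still like to know whether `Q(t - tau) ~ u_Q(t)` is supposed to work.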