lsimplot works without time delay, but doesn’t seem to work with time delay??
```julia
using Plots, ControlSystems, DataInterpolations

sys = tf(1, [0.5, 1]) * tf(1, [3, 1]) #* delay(2)

tval() = rand(range(2, 10, length=5))   # random duration between input changes
uval() = rand(range(-1, 1, length=5))   # random input level

Nchanges = 20
tvec = [tval() for i in 1:Nchanges] |> cumsum   # switch times
uvec = [uval() for i in 1:Nchanges]             # input levels

uinterp = ConstantInterpolation(uvec, tvec)   # piecewise-constant input signal
u(x, t) = uinterp(t)                          # input function for lsim
tfin = tvec[end]
```
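As a side note, the piecewise-constant ("zero-order hold") lookup that `ConstantInterpolation` performs can be sketched in plain Julia. The helper name `zoh` is mine, and I am not certain it matches DataInterpolations' edge-case conventions exactly:

```julia
# Zero-order-hold lookup: return the input level that is active at time t,
# holding the most recent level whose switch time is <= t.
# Before the first switch time, the first level is returned.
function zoh(uvec, tvec, t)
    i = searchsortedlast(tvec, t)            # last index with tvec[i] <= t
    return uvec[clamp(i, 1, length(uvec))]   # clamp handles t outside tvec
end
```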
This gives the “experimental input”:
I can finally simulate the system with this experiment:

```julia
tsim = range(0, tfin, length=200)
lsimplot(sys, u, tsim)
```
Here I have started with default states at zero.
If I change the line `sys = tf(1,[0.5,1])*tf(1,[3,1]) #*delay(2)` to `sys = tf(1,[0.5,1])*tf(1,[3,1])*delay(2)`, I can still do a `stepplot` of the system, but `lsimplot` with the input above gives zero output for all time…
- How can I make `lsim` work with time delay?
- What initial states could I possibly choose when I have a time delay? Would it be reasonable to assume that `x0` refers to the states of the system without the time delay, and that no states need to be chosen for the (discretization of the) time delay?
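For concreteness, here is how I would expect the call to look; this is only a sketch of my assumption (that `x0` covers just the two states of the rational part), not something I know to be the intended API:

```julia
# Assumed usage: x0 sized for the delay-free (rational) part only,
# i.e. 2 states for the two first-order factors; u and tsim as above.
sys_d = tf(1, [0.5, 1]) * tf(1, [3, 1]) * delay(2)
x0 = zeros(2)                       # states of the rational part only?
res = lsim(sys_d, u, tsim; x0 = x0) # does lsim accept this for delay systems?
plot(res)
```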