Float64 error in plotting

Hi, I am new to SciML and Julia. I am trying to use a neural ODE to model an SIR system. Everything in my code works except the plotting.
The problem occurs when I try to plot the prediction against the original data using a scatter plot.

This is the message I am getting:
ERROR: Cannot convert Float64 to series data for plotting

Below is the code:

```julia
using ComponentArrays, Lux, DiffEqFlux, OrdinaryDiffEq, Optimization, OptimizationOptimJL,
      OptimizationOptimisers, Random, Plots
function sir_ode(du,u,p,t)
    S,I,R = u
    b,g,N = p
    du[1] = -b*S*I/N
    du[2] = b*S*I/N-g*I
    du[3] = g*I
end
N=1000.0 #total population
I0=1.0 #initial infected
R0=0.0 #initial recovered
b=0.3 #transmission rate
g=0.1 #recovery rate
S0=N-I0-R0 #initial susceptible
parms = [b,g,N]
init = [S0,I0,R0]
tspan = [0.0,160.0] #timespan
tsteps= 0.1 #timesteps
#Use ODE to solve the SIR problem
sir_prob = ODEProblem(sir_ode,init,tspan,parms) #define the ODE
sir_sol = solve(sir_prob,Tsit5(),saveat = tsteps)
plot(sir_sol,xlabel="Days",ylabel="Number")

# make an Array for ode solution
ode_data = Float32.(Array(sir_sol))

#define Neural Network (NN)
rng = Xoshiro(0) #seeding
dudt = Chain(x -> x, Dense(3, 30, tanh), Dense(30, 3)) # NN that models du/dt
reltol = 1e-7
abstol = 1e-9
# Parameter and State Variables
p, st = Lux.setup(rng, dudt)
sir_neuralode = NeuralODE(dudt, tspan, Tsit5(), saveat = tsteps,reltol=reltol,abstol=abstol)

# create the neural ODE output data
function predict_neuralode(p)
    Float32.(Array(sir_neuralode(init, p, st)[1]))
end
# check the loss; difference between original & NN prediction
function loss_neuralode(p)
    pred = predict_neuralode(p)
    loss = sum(abs2, ode_data .- pred)
    return loss, pred
end

# A callback function to plot the comparison between the original ODE and NN solutions for susceptible, infected, and recovered individuals
callback = function (p, l, pred; doplot=true)
    println(l)
    if doplot
        splt = scatter(tsteps, ode_data[1, :]; label="true susceptible")
        scatter!(splt, tsteps, pred[1, :]; label="prediction susceptible")

        #=iplt = scatter(tsteps, ode_data[2, :]; label="true infected")
        scatter!(iplt, tsteps, pred[2, :]; label="prediction infected")

        rplt = scatter(tsteps, ode_data[3, :]; label="true recovered")
        scatter!(rplt, tsteps, pred[3, :]; label="prediction recovered")

        display(plot(splt, iplt, rplt))=#
        display(plot(splt))
    end
    return false
end

# Defining optimization techniques
pinit = ComponentArray(p)
adtype = Optimization.AutoZygote()
optimizeFunction = Optimization.OptimizationFunction((x, p) -> loss_neuralode(x), adtype)

# Defining the problem to be optimized
neuralProblem = Optimization.OptimizationProblem(optimizeFunction, pinit)

# Optimize the NN parameters with the Adam optimizer for up to 100 iterations
result_neuralode = Optimization.solve(
    neuralProblem, OptimizationOptimisers.Adam(0.05); callback = callback, maxiters = 100)

callback(result_neuralode.u, loss_neuralode(result_neuralode.u)...; doplot = true)
```

`tsteps = 0.1` is just a number, so it can't be used as a plotting axis. I'm guessing you want to read the times from `sir_sol` and plot using those instead. You'll have to look around in there to figure out exactly where they are stored (or maybe someone else can tell you).
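
If I had to guess, the solution object keeps its save times in a `t` field, so a sketch of the fix might look like:

```julia
# With saveat = 0.1 over tspan = [0.0, 160.0], sir_sol.t holds the
# 1601 save times 0.0, 0.1, ..., 160.0.
t = sir_sol.t

splt = scatter(t, ode_data[1, :]; label = "true susceptible")
scatter!(splt, t, pred[1, :]; label = "prediction susceptible")
```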

Thanks, I will do that.

@mikmoore
Your suggestion fixed my problem. I made `tsteps` into an array and it worked.
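
For anyone finding this later, the change was roughly this (a sketch; the endpoints come from `tspan` and the 0.1 save interval):

```julia
# Replace the scalar save interval with the actual vector of save times.
# This works both as the saveat argument and as a scatter x-axis.
tsteps = collect(0.0:0.1:160.0)
```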