UndefRefError while running code based on "Automatically Discover Missing Physics by Embedding Machine Learning into Differential Equations"

Hi there, first time posting a question on this forum, I hope I’m doing this right.

I am currently trying to modify the code from the documentation tutorial "Automatically Discover Missing Physics by Embedding Machine Learning into Differential Equations" and am running into a problem during the training step.

In my case, I am trying to model lung dynamics and train the model with measurement data (inserted as a string representation in the code). The lung model contains the parameters elastance and resistance (only elastance for now, but hopefully both in the future), which I would like to represent with a neural network.
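(For reference, the underlying single-compartment model is P(t) = E*V(t) + R*dV/dt, which the code below rearranges to dV/dt = (P - E*V) / R.)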

The code begins with data import and formatting before moving on to the implementation of the UDE:

# Packages used

using CSV, DataFrames, Interpolations

# SciML Tools
using OrdinaryDiffEq, ModelingToolkit, DataDrivenDiffEq, SciMLSensitivity, DataDrivenSparse
using Optimization, OptimizationOptimisers, OptimizationOptimJL, LineSearches

# Standard Libraries
using LinearAlgebra, Statistics

# External Libraries
using ComponentArrays, Lux, Zygote, Plots, StableRNGs
gr()

# Read the data from string_representation
string_representation = "Volume,Flow,Pressure,Time\n0.00482662242664118,-0.0107790538650184,5.25878787878788,3.88097115976564\n0.00480324768368645,0.00610473696873567,5.44,3.89097251118164\n0.00496200506609732,0.0256508319193269,5.64787878787879,3.90097122246094\n0.00533109551801634,0.048163188241624884,5.89393939393939,3.9109717738769\n0.00594108446420087,0.073816653042622,6.14363636363636,3.92097324526368\n0.00681941452753682,0.101875515282502,6.40424242424242,3.93097175654292\n0.00798892943874884,0.131946296081982,6.67818181818182,3.94097522802735\n"
data = CSV.read(IOBuffer(string_representation), DataFrame)
t_data = data.Time
start_time = t_data[1]
adjusted_t_data = t_data .- start_time
measured_volume_data = data.Volume
measured_flow_data = data.Flow
measured_pressure_data = data.Pressure .- 5.2

# Simulation parameters
volume_initial = measured_volume_data[1]   # Initial lung volume
flow_initial = measured_flow_data[1]       # Initial lung flow
tspan = (0.0, adjusted_t_data[end])  # Time span for simulation
initial_condition = [volume_initial, flow_initial]
p_ = [0.1, 9.52]

# Interpolation function for measured data
pressure_interp = interpolate((adjusted_t_data,), measured_pressure_data, Gridded(Linear()))

# Define neural network (Input of volume and flow, output of elastance)
const U = Lux.Chain(
    Lux.Dense(2, 10, sigmoid),
    Lux.Dense(10, 10, sigmoid),
    Lux.Dense(10, 1)
)

# Set a random seed for reproducible behaviour
rng = StableRNG(1111)

p, st = Lux.setup(rng, U)
const _st = st

# UDE lung model. There is only one differential equation, since a definition for the derivative of flow doesn't exist, but the state vector is defined as (volume, flow) to allow passing initial_condition. I am aware that flow is not available in the solution of the ODEProblem this way.
function lung_dynamics!(du, u, p, t, p_true)

    volume,flow = u
    pressure = pressure_interp(t)

    û = U(u, p, _st)[1]      # Network prediction (elastance)

    du[1] = (pressure - u[1] * û[1]) / p_true[2]    #Equation of motion. Flow = (pressure - volume * elastance) / resistance

end

# Closure with the known parameter
nn_dynamics!(du, u, p, t) = lung_dynamics!(du, u, p, t, p_)
# Define the problem
prob_nn = ODEProblem(nn_dynamics!, initial_condition, tspan, p)

function predict(θ, X = initial_condition, T = tspan)
    _prob = remake(prob_nn, u0 = X, tspan = (T[1], T[end]), p = θ)
    Array(solve(_prob, Vern7(), saveat = adjusted_t_data,
                abstol = 1e-6, reltol = 1e-6,
                sensealg=QuadratureAdjoint(autojacvec=ReverseDiffVJP(true))))
end

function loss(θ)
    X̂ = predict(θ)
    mean(abs2, measured_volume_data .- X̂[1,:])
end

losses = Float64[]

callback = function (p, l)
    push!(losses, l)
    if length(losses) % 50 == 0
        println("Current loss after $(length(losses)) iterations: $(losses[end])")
    end
    return false
end

adtype = Optimization.AutoZygote()
optf = Optimization.OptimizationFunction((x, p) -> loss(x), adtype)
optprob = Optimization.OptimizationProblem(optf, ComponentVector{Float64}(p))
res1 = Optimization.solve(optprob, OptimizationOptimisers.Adam(), callback = callback, maxiters = 2)

Running the last line produces the following error message and stack trace (middle section partly removed due to the character limit):

UndefRefError: access to undefined reference

Stacktrace:
  [1] getindex
    @ .\\essentials.jl:13 [inlined]
  [2] increment_deriv!
    @ C:\\Users\\minyo\\.julia\\packages\\ReverseDiff\\p1MzG\\src\\derivatives\\propagation.jl:35 [inlined]
  [3] increment_deriv!
    @ C:\\Users\\minyo\\.julia\\packages\\ReverseDiff\\p1MzG\\src\\derivatives\\propagation.jl:40 [inlined]
  [4] _vecjacobian!(dλ::Vector{Float64}, y::Vector{Float64}, λ::Vector{Float64}, p::ComponentVector{Float64, Vector{Float64}, Tuple{Axis{(layer_1 = ViewAxis(1:30, Axis(weight = ViewAxis(1:20, ShapedAxis((10, 2))), bias = ViewAxis(21:30, ShapedAxis((10, 1))))), layer_2 = ViewAxis(31:140, Axis(weight = ViewAxis(1:100, ShapedAxis((10, 10))), bias = ViewAxis(101:110, ShapedAxis((10, 1))))), layer_3 = ViewAxis(141:151, Axis(weight = ViewAxis(1:10, ShapedAxis((1, 10))), bias = ViewAxis(11:11, ShapedAxis((1, 1))))))}}}, t::Float64, S::SciMLSensitivity.ODEQuadratureAdjointSensitivityFunction{SciMLSensitivity.AdjointDiffCache{Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, 
    @ SciMLSensitivity C:\\Users\\minyo\\.julia\\packages\\SciMLSensitivity\\rXkM4\\src\\derivative_wrappers.jl:505
  [5] #vecjacobian!#18
    @ C:\\Users\\minyo\\.julia\\packages\\SciMLSensitivity\\rXkM4\\src\\derivative_wrappers.jl:231 [inlined]
  [6] vecjacobian!
    @ C:\\Users\\minyo\\.julia\\packages\\SciMLSensitivity\\rXkM4\\src\\derivative_wrappers.jl:228 [inlined]
 [10] auto_dt_reset!
    @ C:\\Users\\minyo\\.julia\\packages\\OrdinaryDiffEq\\ZbQoo\\src\\integrators\\integrator_interface.jl:453 [inlined]
 [11] handle_dt!
 [13] __init (repeats 5 times)
    @ C:\\Users\\minyo\\.julia\\packages\\OrdinaryDiffEq\\ZbQoo\\src\\solve.jl:11 [inlined]
 [14] #__solve#761
    @ C:\\Users\\minyo\\.julia\\packages\\OrdinaryDiffEq\\ZbQoo\\src\\solve.jl:6 [inlined]
 [15] __solve
    @ C:\\Users\\minyo\\.julia\\packages\\OrdinaryDiffEq\\ZbQoo\\src\\solve.jl:1 [inlined]
 [16] solve_call(_prob::ODEProblem{Vector{Float64}, Tuple{Float64, Float64}, true, ComponentVector{Float64, Vector{Float64}, Tuple{Axis{(layer_1 = ViewAxis(1:30, Axis(weight = ViewAxis(1:20, ShapedAxis((10, 2))), bias = ViewAxis(21:30, ShapedAxis((10, 1))))), layer_2 = ViewAxis(31:140, Axis(weight = ViewAxis(1:100, ShapedAxis((10, 10))), bias = ViewAxis(101:110, ShapedAxis((10, 1))))), layer_3 = ViewAxis(141:151, Axis(weight = ViewAxis(1:10, ShapedAxis((1, 10))), bias = ViewAxis(11:11, ShapedAxis((1, 1))))))}}}, ODEFunction{true, true,
    @ DiffEqBase C:\\Users\\minyo\\.julia\\packages\\DiffEqBase\\yM6LF\\src\\solve.jl:612
 [17] solve_call
    @ C:\\Users\\minyo\\.julia\\packages\\DiffEqBase\\yM6LF\\src\\solve.jl:569 [inlined]
 [18] #solve_up#53
    @ C:\\Users\\minyo\\.julia\\packages\\DiffEqBase\\yM6LF\\src\\solve.jl:1080 [inlined]
 [19] solve_up
    @ C:\\Users\\minyo\\.julia\\packages\\DiffEqBase\\yM6LF\\src\\solve.jl:1066 [inlined]
 [20] #solve#51
    @ C:\\Users\\minyo\\.julia\\packages\\DiffEqBase\\yM6LF\\src\\solve.jl:1003 [inlined]
    @ SciMLSensitivity C:\\Users\\minyo\\.julia\\packages\\SciMLSensitivity\\rXkM4\\src\\quadrature_adjoint.jl:333
 [22] _adjoint_sensitivities
    @ C:\\Users\\minyo\\.julia\\packages\\SciMLSensitivity\\rXkM4\\src\\quadrature_adjoint.jl:321 [inlined]
 [23] #adjoint_sensitivities#63
    @ C:\\Users\\minyo\\.julia\\packages\\SciMLSensitivity\\rXkM4\\src\\sensitivity_interface.jl:386 [inlined]
 [54] (::OptimizationZygoteExt.var\"#38#56\"{OptimizationZygoteExt.var\"#37#55\"{OptimizationFunction{true, AutoZygote, var\"#23#24\", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{ComponentVector{Float64, Vector{Float64}, Tuple{Axis{(layer_1 = ViewAxis(1:30, Axis(weight = ViewAxis(1:20, ShapedAxis((10, 2))), bias = ViewAxis(21:30, ShapedAxis((10, 1))))), layer_2 = ViewAxis(31:140, Axis(weight = ViewAxis(1:100, ShapedAxis((10, 10))), bias = ViewAxis(101:110, ShapedAxis((10, 1))))), layer_3 = ViewAxis(141:151, Axis(weight = ViewAxis(1:10, ShapedAxis((1, 10))), bias = ViewAxis(11:11, ShapedAxis((1, 1))))))}}}, SciMLBase.NullParameters}}})(::ComponentVector{Float64, Vector{Float64}, Tuple{Axis{(layer_1 = ViewAxis(1:30, Axis(weight = ViewAxis(1:20, ShapedAxis((10, 2))), bias = ViewAxis(21:30, ShapedAxis((10, 1))))), layer_2 = ViewAxis(31:140, Axis(weight = ViewAxis(1:100, ShapedAxis((10, 10))), bias = ViewAxis(101:110, ShapedAxis((10, 1))))), layer_3 = ViewAxis(141:151, Axis(weight = ViewAxis(1:10, ShapedAxis((1, 10))), bias = ViewAxis(11:11, ShapedAxis((1, 1))))))}}}, ::ComponentVector{Float64, Vector{Float64}, Tuple{Axis{(layer_1 = ViewAxis(1:30, Axis(weight = ViewAxis(1:20, ShapedAxis((10, 2))), bias = ViewAxis(21:30, ShapedAxis((10, 1))))), layer_2 = ViewAxis(31:140, Axis(weight = ViewAxis(1:100, ShapedAxis((10, 10))), bias = ViewAxis(101:110, ShapedAxis((10, 1))))), layer_3 = ViewAxis(141:151, Axis(weight = ViewAxis(1:10, ShapedAxis((1, 10))), bias = ViewAxis(11:11, ShapedAxis((1, 1))))))}}})
    @ OptimizationZygoteExt C:\\Users\\minyo\\.julia\\packages\\OptimizationBase\\rRpJs\\ext\\OptimizationZygoteExt.jl:93
 [55] macro expansion
    @ C:\\Users\\minyo\\.julia\\packages\\OptimizationOptimisers\\AOkbT\\src\\OptimizationOptimisers.jl:68 [inlined]
 [56] macro expansion
    @ C:\\Users\\minyo\\.julia\\packages\\Optimization\\5DEdF\\src\\utils.jl:32 [inlined]
 [57] __solve(cache::OptimizationCache{OptimizationFunction{true, AutoZygote, var\"#23#24\", OptimizationZygoteExt.var\"#38#56\"{OptimizationZygoteExt.var\"#37#55\"{OptimizationFunction{true, AutoZygote, var\"#23#24\", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{ComponentVector{Float64, Vector{Float64}, Tuple{Axis{(layer_1 = ViewAxis(1:30, Axis(weight = ViewAxis(1:20, ShapedAxis((10, 2))), bias = ViewAxis(21:30, ShapedAxis((10, 1))))), layer_2 = ViewAxis(31:140, Axis(weight = ViewAxis(1:100, ShapedAxis((10, 10))), bias = ViewAxis(101:110, ShapedAxis((10, 1))))), layer_3 = ViewAxis(141:151, Axis(weight = ViewAxis(1:10, ShapedAxis((1, 10))), bias = ViewAxis(11:11, ShapedAxis((1, 1))))))}}}, SciMLBase.NullParameters}}}, OptimizationZygoteExt.var\"#41#59\"{OptimizationZygoteExt.var\"#37#55\"{OptimizationFunction{true, AutoZygote, var\"#23#24\", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{ComponentVector{Float64, Vector{Float64}, Tuple{Axis{(layer_1 = ViewAxis(1:30, Axis(weight = ViewAxis(1:20, ShapedAxis((10, 2))), bias = ViewAxis(21:30, ShapedAxis((10, 1))))), layer_2 = ViewAxis(31:140, Axis(weight = ViewAxis(1:100, ShapedAxis((10, 10))), bias = ViewAxis(101:110, ShapedAxis((10, 1))))), layer_3 = ViewAxis(141:151, Axis(weight = ViewAxis(1:10, ShapedAxis((1, 10))), bias = ViewAxis(11:11, ShapedAxis((1, 1))))))}}}, SciMLBase.NullParameters}}}, OptimizationZygoteExt.var\"#45#63\", Nothing, OptimizationZygoteExt.var\"#49#67\"{OptimizationFunction{true, AutoZygote, var\"#23#24\", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{ComponentVector{Float64, Vector{Float64}, Tuple{Axis{(layer_1 = ViewAxis(1:30, Axis(weight = ViewAxis(1:20, ShapedAxis((10, 2))), bias = ViewAxis(21:30, ShapedAxis((10, 1))))), layer_2 = ViewAxis(31:140, Axis(weight = ViewAxis(1:100, ShapedAxis((10, 10))), bias = ViewAxis(101:110, ShapedAxis((10, 1))))), layer_3 = ViewAxis(141:151, Axis(weight = ViewAxis(1:10, ShapedAxis((1, 10))), bias = ViewAxis(11:11, ShapedAxis((1, 1))))))}}}, SciMLBase.NullParameters}}, Nothing, Nothing, OptimizationZygoteExt.var\"#53#71\"{OptimizationFunction{true, AutoZygote, var\"#23#24\", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{ComponentVector{Float64, Vector{Float64}, Tuple{Axis{(layer_1 = ViewAxis(1:30, Axis(weight = ViewAxis(1:20, ShapedAxis((10, 2))), bias = ViewAxis(21:30, ShapedAxis((10, 1))))), layer_2 = ViewAxis(31:140, Axis(weight = ViewAxis(1:100, ShapedAxis((10, 10))), bias = ViewAxis(101:110, ShapedAxis((10, 1))))), layer_3 = ViewAxis(141:151, Axis(weight = ViewAxis(1:10, ShapedAxis((1, 10))), bias = ViewAxis(11:11, ShapedAxis((1, 1))))))}}}, SciMLBase.NullParameters}}, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, 
Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{ComponentVector{Float64, Vector{Float64}, Tuple{Axis{(layer_1 = ViewAxis(1:30, Axis(weight = ViewAxis(1:20, ShapedAxis((10, 2))), bias = ViewAxis(21:30, ShapedAxis((10, 1))))), layer_2 = ViewAxis(31:140, Axis(weight = ViewAxis(1:100, ShapedAxis((10, 10))), bias = ViewAxis(101:110, ShapedAxis((10, 1))))), layer_3 = ViewAxis(141:151, Axis(weight = ViewAxis(1:10, ShapedAxis((1, 10))), bias = ViewAxis(11:11, ShapedAxis((1, 1))))))}}}, SciMLBase.NullParameters}, Nothing, Nothing, Nothing, Nothing, Nothing, Optimisers.Adam, Base.Iterators.Cycle{Tuple{OptimizationBase.NullData}}, Bool, var\"#21#22\"})
    @ OptimizationOptimisers C:\\Users\\minyo\\.julia\\packages\\OptimizationOptimisers\\AOkbT\\src\\OptimizationOptimisers.jl:66
 [58] solve!(cache::OptimizationCache{OptimizationFunction{true, AutoZygote, var\"#23#24\", OptimizationZygoteExt.var\"#38#56\"{OptimizationZygoteExt.var\"#37#55\"{OptimizationFunction{true, AutoZygote, var\"#23#24\", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{ComponentVector{Float64, Vector{Float64}, Tuple{Axis{(layer_1 = ViewAxis(1:30, Axis(weight = ViewAxis(1:20, ShapedAxis((10, 2))), bias = ViewAxis(21:30, ShapedAxis((10, 1))))), layer_2 = ViewAxis(31:140, Axis(weight = ViewAxis(1:100, ShapedAxis((10, 10))), bias = ViewAxis(101:110, ShapedAxis((10, 1))))), layer_3 = ViewAxis(141:151, Axis(weight = ViewAxis(1:10, ShapedAxis((1, 10))), bias = ViewAxis(11:11, ShapedAxis((1, 1))))))}}}, SciMLBase.NullParameters}}}, OptimizationZygoteExt.var\"#41#59\"{OptimizationZygoteExt.var\"#37#55\"{OptimizationFunction{true, AutoZygote, var\"#23#24\", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{ComponentVector{Float64, Vector{Float64}, Tuple{Axis{(layer_1 = ViewAxis(1:30, Axis(weight = ViewAxis(1:20, ShapedAxis((10, 2))), bias = ViewAxis(21:30, ShapedAxis((10, 1))))), layer_2 = ViewAxis(31:140, Axis(weight = ViewAxis(1:100, ShapedAxis((10, 10))), bias = ViewAxis(101:110, ShapedAxis((10, 1))))), layer_3 = ViewAxis(141:151, Axis(weight = ViewAxis(1:10, ShapedAxis((1, 10))), bias = ViewAxis(11:11, ShapedAxis((1, 1))))))}}}, SciMLBase.NullParameters}}}, OptimizationZygoteExt.var\"#45#63\", Nothing, OptimizationZygoteExt.var\"#49#67\"{OptimizationFunction{true, AutoZygote, var\"#23#24\", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{ComponentVector{Float64, Vector{Float64}, Tuple{Axis{(layer_1 = ViewAxis(1:30, Axis(weight = ViewAxis(1:20, ShapedAxis((10, 2))), bias = ViewAxis(21:30, ShapedAxis((10, 1))))), layer_2 = ViewAxis(31:140, Axis(weight = ViewAxis(1:100, ShapedAxis((10, 10))), bias = ViewAxis(101:110, ShapedAxis((10, 1))))), layer_3 = ViewAxis(141:151, Axis(weight = ViewAxis(1:10, ShapedAxis((1, 10))), bias = ViewAxis(11:11, ShapedAxis((1, 1))))))}}}, SciMLBase.NullParameters}}, Nothing, Nothing, OptimizationZygoteExt.var\"#53#71\"{OptimizationFunction{true, AutoZygote, var\"#23#24\", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{ComponentVector{Float64, Vector{Float64}, Tuple{Axis{(layer_1 = ViewAxis(1:30, Axis(weight = ViewAxis(1:20, ShapedAxis((10, 2))), bias = ViewAxis(21:30, ShapedAxis((10, 1))))), layer_2 = ViewAxis(31:140, Axis(weight = ViewAxis(1:100, ShapedAxis((10, 10))), bias = ViewAxis(101:110, ShapedAxis((10, 1))))), layer_3 = ViewAxis(141:151, Axis(weight = ViewAxis(1:10, ShapedAxis((1, 10))), bias = ViewAxis(11:11, ShapedAxis((1, 1))))))}}}, SciMLBase.NullParameters}}, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, 
Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{ComponentVector{Float64, Vector{Float64}, Tuple{Axis{(layer_1 = ViewAxis(1:30, Axis(weight = ViewAxis(1:20, ShapedAxis((10, 2))), bias = ViewAxis(21:30, ShapedAxis((10, 1))))), layer_2 = ViewAxis(31:140, Axis(weight = ViewAxis(1:100, ShapedAxis((10, 10))), bias = ViewAxis(101:110, ShapedAxis((10, 1))))), layer_3 = ViewAxis(141:151, Axis(weight = ViewAxis(1:10, ShapedAxis((1, 10))), bias = ViewAxis(11:11, ShapedAxis((1, 1))))))}}}, SciMLBase.NullParameters}, Nothing, Nothing, Nothing, Nothing, Nothing, Optimisers.Adam, Base.Iterators.Cycle{Tuple{OptimizationBase.NullData}}, Bool, var\"#21#22\"})
    @ SciMLBase C:\\Users\\minyo\\.julia\\packages\\SciMLBase\\SDjaO\\src\\solve.jl:188
 [59] solve(::OptimizationProblem{true, OptimizationFunction{true, AutoZygote, var\"#23#24\", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, ComponentVector{Float64, Vector{Float64}, Tuple{Axis{(layer_1 = ViewAxis(1:30, Axis(weight = ViewAxis(1:20, ShapedAxis((10, 2))), bias = ViewAxis(21:30, ShapedAxis((10, 1))))), layer_2 = ViewAxis(31:140, Axis(weight = ViewAxis(1:100, ShapedAxis((10, 10))), bias = ViewAxis(101:110, ShapedAxis((10, 1))))), layer_3 = ViewAxis(141:151, Axis(weight = ViewAxis(1:10, ShapedAxis((1, 10))), bias = ViewAxis(11:11, ShapedAxis((1, 1))))))}}}, SciMLBase.NullParameters, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, @Kwargs{}}, ::Optimisers.Adam; kwargs::@Kwargs{callback::var\"#21#22\", maxiters::Int64})
    @ SciMLBase C:\\Users\\minyo\\.julia\\packages\\SciMLBase\\SDjaO\\src\\solve.jl:96

Based on my debugging, the loss() and predict() functions are called successfully, but the code never enters the callback function. From what I can see, I do not have any undefined variables.
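For reference, a minimal way to reproduce the failure outside of Optimization.solve (assuming all the definitions above have been run) should be to differentiate the loss directly with Zygote, since AutoZygote goes through the same adjoint code path:

θ0 = ComponentVector{Float64}(p)
loss(θ0)                   # the forward solve succeeds
Zygote.gradient(loss, θ0)  # expected to hit the same UndefRefError in the adjoint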

Any input regarding this matter would be appreciated. Thank you for taking the time to read this. 🙂

What version of Julia are you using, and what does ]st show?

I am currently running Julia 1.10.4 with v1.79.2 of the Julia extension in VS Code.

Running ]st gives me the following output:

Status `C:\Users\minyo\.julia\environments\v1.10\Project.toml`
  [336ed68f] CSV v0.10.14
  [b0b7db55] ComponentArrays v0.15.13
  [2445eb08] DataDrivenDiffEq v1.4.1
  [5b588203] DataDrivenSparse v0.1.2
  [a93c6f00] DataFrames v1.6.1
⌃ [82cc6244] DataInterpolations v5.0.0
  [1130ab10] DiffEqParamEstim v2.2.0
  [0c46a032] DifferentialEquations v7.13.0
⌃ [31c24e10] Distributions v0.25.108
  [a98d9a8b] Interpolations v0.15.1
  [d3d80556] LineSearches v7.2.0
⌃ [b2108857] Lux v0.5.47
⌃ [961ee093] ModelingToolkit v9.15.0
⌃ [7f7a1694] Optimization v3.24.3
  [36348300] OptimizationOptimJL v0.3.2
  [42dfb2eb] OptimizationOptimisers v0.2.1
  [500b13db] OptimizationPolyalgorithms v0.2.1
⌃ [1dea7af3] OrdinaryDiffEq v6.74.1
  [91a5bcdd] Plots v1.40.4
⌃ [1ed8b502] SciMLSensitivity v7.56.2
  [860ef19b] StableRNGs v1.0.2
  [e88e6eb3] Zygote v0.6.70
  [37e2e46d] LinearAlgebra
  [10745b16] Statistics v1.10.0
Info Packages marked with ⌃ have new versions available and may be upgradable.

The error was due to there being only one differential equation in my model even though the state vector has length 2: du[2] is never assigned inside lung_dynamics!, which is what the reverse-mode adjoint trips over. I didn't think this would be a problem, since it worked previously when I tried it without the integrated neural network (code below). A sketch of one way to restructure the UDE version follows after that code.

using Optim  # provides optimize, NelderMead and Optim.Options used for the fit below

# Define the lung dynamics model
function lung_dynamics!(dv, v, p, t)
    volume, flow = v 
    elastance, resistance = p

    # Calculate pressure using the interpolation function at time t
    pressure = pressure_interp(t)
    # Calculate flow rate using the pressure-driven equation
    flow = (pressure - volume * elastance) / resistance

    # Update volume rate of change (dV/dt)
    dv[1] = flow
end

# Define the loss function for parameter estimation
function loss_function(p)
    elastance, resistance = p
    problem = ODEProblem(lung_dynamics!, initial_condition, tspan, p)
    solution = solve(problem, Tsit5(), saveat=adjusted_t_data)  
    model_volumes = solution[1, :]
    return sum((model_volumes .- measured_volume_data) .^ 2)
end

# Initial guess for the parameters
initial_guess = [0.5, 0.5]

# Fit the model to the data.
result = optimize(loss_function, initial_guess, NelderMead(), Optim.Options(iterations = 100))

# Print the estimated parameters
println("Estimated elastance: ", result.minimizer[1])
println("Estimated resistance: ", result.minimizer[2])