Bayesian inference with PINNs for ODEs with exogenous inputs taken from data

First of all, I’m quite new to Julia, so I’d like to apologize if the question is too naïve; it is probably a product of my limited understanding of how Julia handles this type of algorithm. I’m currently using Julia version 1.10.1. My problem is the following: I need to estimate six parameters (c1, c2, c3, c4, c5, c6) of an ODE representing the aerodynamic model of a wind turbine, and I’ve decided to do it via Bayesian inference using physics-informed neural networks (PINNs) with the package NeuralPDE.jl. I have not been able to find clear examples of how to do this in the presence of exogenous inputs (Tg, θ, vr) taken from data. I wrote the ODE in out-of-place form as the documentation states, and I seem to be able to solve it correctly. But when I try to solve the ODE with a Bayesian PINN for parameter estimation, I get an error that I’m unable to understand. I suspect the problem lies in the way I handle the exogenous inputs in the wind_turbine(u, p, t) function, but I’m unsure how to fix it. Any insights or guidance on this issue, or any other tips on Bayesian inference with PINNs in Julia, would be greatly appreciated. Thank you in advance!

See the code below:

using NeuralPDE, Lux, OrdinaryDiffEq, Distributions, Random, XLSX, Plots

# Define wind turbine aerodynamic model with exogenous inputs Tg, θ and vr
function wind_turbine(u, p, t)
    # Index corresponding to current time step
    # (note: relies on the globals tspan, dt and the *_data vectors defined below)
    idx = Int(floor((t - tspan[1]) / dt)) + 1
    # Extract current input from data
    Tg_t = Tg_data[idx]
    θ_t = θ_data[idx]
    vr_t = vr_data[idx]
    
    # Extract model parameters
    c1, c2, c3, c4, c5, c6 = p
    
    # Extract current state
    wr = u
    
    # Define model constants
    μd = 0.05
    Rr = 120.998
    ρ = 1.225
    Ar = π * Rr^2
    Jr = 321699000
    Jg = 3.777e6
    
    # Calculate tip-speed ratio λ
    λ = wr * Rr / vr_t
    
    # Calculate 1/λi (numerical approx. for Cp(θ,λ))
    λi_inv = 1 / (λ + 0.08 * θ_t) - 0.035 / (θ_t^3 + 1)
    
    # Calculate Cp (numerical approx. for Cp(θ,λ))
    Cp = c1 * (c2 * λi_inv - c3 * θ_t - c4) * exp(-c5 * λi_inv) + c6 * λ
    
    # Evaluate differential equation for rotor speed
    dwr = ((1 - μd) * 1 / (2 * wr) * ρ * Ar * vr_t^3 * Cp - Tg_t) / (Jr + Jg)
    
    return dwr
end

# Define initial-value problem.
wr0 = 0.8246
Tg0 = 19503192.0
vr0 = 13.9445
theta0 = 0.1743
u0 = wr0  # Initial state
p = [0.5176, 116, 0.4, 5, 21, 0.0068]  # Model parameters (not suitable for this specific wind turbine, but my best guess for now)
tspan = (0.0, 5.0)

# Load data from Excel sheet
function load_data(filename)
    data = XLSX.readxlsx(filename)["Ark1"]  # Assuming your data is in Sheet1 (Ark1)
    wr = data["A"][1:501]  # Assuming column A contains angular speed data, take first 501 samples
    Tg = data["B"][1:501]  # Assuming column B contains generator torque data, take first 501 samples
    θ = data["C"][1:501]  # Assuming column C contains pitch angle data, take first 501 samples
    vr = data["D"][1:501]  # Assuming column D contains wind speed data, take first 501 samples
    return wr, Tg, θ, vr
end

# Load data from Excel sheet
filename = "...\\data_14ms_cropped.xlsx" # write corresponding path
wr_data, Tg_data, θ_data, vr_data = load_data(filename)

t_data = range(tspan[1], tspan[2], length=501)  # Time points
dataset = [wr_data, Tg_data, θ_data, vr_data, t_data]

# Convert dataset into Vector{Vector{Float64}} (just in case)
dataset = [Float64.(vec) for vec in dataset]

# Define ODE problem
prob_model = ODEProblem(wind_turbine, u0, tspan, p)

# Solve the ODE using stiffness detection and auto-switching algorithm
dt = 0.01  # Sampling time
sol_model = solve(prob_model, AutoTsit5(Rosenbrock23()), saveat=dt)

# Plot trajectory and compare it to data (they understandably don't match, since the real wind turbine parameters are unknown)
plot(sol_model, xlabel="Time", ylabel="Rotor Speed", label="Rotor speed (model)")
plot!(t_data, wr_data, label="Rotor speed (data)")
plot!(legend=:topright)

# Define BNN architecture with adjusted input size
rng = Random.default_rng()
Random.seed!(rng, 0)
n = 15
chain = Lux.Chain(
            Lux.Dense(1, n, Lux.σ),
            Lux.Dense(n, n, Lux.σ),
            Lux.Dense(n, n, Lux.σ),
            Lux.Dense(n, 6)
        )
ps, st = Lux.setup(rng, chain) |> Lux.f64

# Define Bayesian PINN solver
alg = BNNODE(chain;
              dataset=dataset,
              draw_samples=1000,
              l2std=[0.1],
              phystd=[0.1],
              priorsNNw=(0.0, 0.05),
              param=[Normal(0.5176, 0.5), Normal(116, 50), Normal(0.4, 0.4), Normal(5, 4), Normal(21, 10), Normal(0.0068, 0.004)],
              progress=true) # I have to think carefully about the arguments in this function

# Solve the ODE using Bayesian neural network
sol_lux_pestim = solve(prob_model, alg; saveat = dt)

The error message after executing the last line is the following:

┌ Error: Exception while generating log record in module NeuralPDE at C:\Users\FX03NI\.julia\packages\NeuralPDE\Xp1OF\src\advancedHMC_MCMC.jl:510
│   exception =
│    MethodError: no method matching +(::Vector{Float64}, ::Float64)
│    For element-wise addition, use broadcasting with dot syntax: array .+ scalar
│    
│    Closest candidates are:
│      +(::Any, ::Any, ::Any, ::Any...)
│       @ Base operators.jl:587
│      +(::ChainRulesCore.NoTangent, ::Any)
│       @ ChainRulesCore C:\Users\FX03NI\.julia\packages\ChainRulesCore\zgT0R\src\tangent_arithmetic.jl:59
│      +(::Any, ::ChainRulesCore.NoTangent)
│       @ ChainRulesCore C:\Users\FX03NI\.julia\packages\ChainRulesCore\zgT0R\src\tangent_arithmetic.jl:60
│      ...
│    
│    Stacktrace:
│      [1] wind_turbine(u::Vector{Float64}, p::Vector{Float64}, t::Float64)
│        @ Main .\In[20]:28
│      [2] (::ODEFunction{false, SciMLBase.AutoSpecialize, typeof(wind_turbine), LinearAlgebra.UniformScaling{Bool}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing, Nothing, Nothing, Nothing})(::Vector{Float64}, ::Vararg{Any})
│        @ SciMLBase C:\Users\FX03NI\.julia\packages\SciMLBase\NjslX\src\scimlfunctions.jl:2184
│      [3] (::NeuralPDE.var"#402#405"{ODEFunction{false, SciMLBase.AutoSpecialize, typeof(wind_turbine), LinearAlgebra.UniformScaling{Bool}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, Vector{Float64}, Matrix{Float64}})(i::Int64)
│        @ NeuralPDE .\none:0
│      [4] iterate
│        @ .\generator.jl:47 [inlined]
│      [5] collect(itr::Base.Generator{UnitRange{Int64}, NeuralPDE.var"#402#405"{ODEFunction{false, SciMLBase.AutoSpecialize, typeof(wind_turbine), LinearAlgebra.UniformScaling{Bool}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, Vector{Float64}, Matrix{Float64}}})
│        @ Base .\array.jl:834
│      [6] innerdiff(Tar::NeuralPDE.LogTargetDensity{Chain{@NamedTuple{layer_1::Dense{true, typeof(sigmoid_fast), typeof(glorot_uniform), typeof(zeros32)}, layer_2::Dense{true, typeof(sigmoid_fast), typeof(glorot_uniform), typeof(zeros32)}, layer_3::Dense{true, typeof(sigmoid_fast), typeof(glorot_uniform), typeof(zeros32)}, layer_4::Dense{true, typeof(identity), typeof(glorot_uniform), typeof(zeros32)}}, Nothing}, @NamedTuple{layer_1::@NamedTuple{}, layer_2::@NamedTuple{}, layer_3::@NamedTuple{}, layer_4::@NamedTuple{}}, GridTraining{Float64}, @NamedTuple{layer_1::@NamedTuple{weight::Matrix{Float32}, bias::Matrix{Float32}}, layer_2::@NamedTuple{weight::Matrix{Float32}, bias::Matrix{Float32}}, layer_3::@NamedTuple{weight::Matrix{Float32}, bias::Matrix{Float32}}, layer_4::@NamedTuple{weight::Matrix{Float32}, bias::Matrix{Float32}}}, Vector{ContinuousDistribution}, Vector{Vector{Float64}}}, f::ODEFunction{false, SciMLBase.AutoSpecialize, typeof(wind_turbine), LinearAlgebra.UniformScaling{Bool}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing, Nothing, Nothing, Nothing}, autodiff::Bool, t::Vector{Float64}, θ::Vector{Float64}, ode_params::Vector{Float64})
│        @ NeuralPDE C:\Users\FX03NI\.julia\packages\NeuralPDE\Xp1OF\src\advancedHMC_MCMC.jl:237
│      [7] getlogpdf(strategy::GridTraining{Float64}, Tar::NeuralPDE.LogTargetDensity{Chain{@NamedTuple{layer_1::Dense{true, typeof(sigmoid_fast), typeof(glorot_uniform), typeof(zeros32)}, layer_2::Dense{true, typeof(sigmoid_fast), typeof(glorot_uniform), typeof(zeros32)}, layer_3::Dense{true, typeof(sigmoid_fast), typeof(glorot_uniform), typeof(zeros32)}, layer_4::Dense{true, typeof(identity), typeof(glorot_uniform), typeof(zeros32)}}, Nothing}, @NamedTuple{layer_1::@NamedTuple{}, layer_2::@NamedTuple{}, layer_3::@NamedTuple{}, layer_4::@NamedTuple{}}, GridTraining{Float64}, @NamedTuple{layer_1::@NamedTuple{weight::Matrix{Float32}, bias::Matrix{Float32}}, layer_2::@NamedTuple{weight::Matrix{Float32}, bias::Matrix{Float32}}, layer_3::@NamedTuple{weight::Matrix{Float32}, bias::Matrix{Float32}}, layer_4::@NamedTuple{weight::Matrix{Float32}, bias::Matrix{Float32}}}, Vector{ContinuousDistribution}, Vector{Vector{Float64}}}, f::Function, autodiff::Bool, tspan::Tuple{Float64, Float64}, ode_params::Vector{Float64}, θ::Vector{Float64})
│        @ NeuralPDE C:\Users\FX03NI\.julia\packages\NeuralPDE\Xp1OF\src\advancedHMC_MCMC.jl:150
│      [8] physloglikelihood(Tar::NeuralPDE.LogTargetDensity{Chain{@NamedTuple{layer_1::Dense{true, typeof(sigmoid_fast), typeof(glorot_uniform), typeof(zeros32)}, layer_2::Dense{true, typeof(sigmoid_fast), typeof(glorot_uniform), typeof(zeros32)}, layer_3::Dense{true, typeof(sigmoid_fast), typeof(glorot_uniform), typeof(zeros32)}, layer_4::Dense{true, typeof(identity), typeof(glorot_uniform), typeof(zeros32)}}, Nothing}, @NamedTuple{layer_1::@NamedTuple{}, layer_2::@NamedTuple{}, layer_3::@NamedTuple{}, layer_4::@NamedTuple{}}, GridTraining{Float64}, @NamedTuple{layer_1::@NamedTuple{weight::Matrix{Float32}, bias::Matrix{Float32}}, layer_2::@NamedTuple{weight::Matrix{Float32}, bias::Matrix{Float32}}, layer_3::@NamedTuple{weight::Matrix{Float32}, bias::Matrix{Float32}}, layer_4::@NamedTuple{weight::Matrix{Float32}, bias::Matrix{Float32}}}, Vector{ContinuousDistribution}, Vector{Vector{Float64}}}, θ::Vector{Float64})
│        @ NeuralPDE C:\Users\FX03NI\.julia\packages\NeuralPDE\Xp1OF\src\advancedHMC_MCMC.jl:137
│      [9] macro expansion
│        @ .\logging.jl:373 [inlined]
│     [10] ahmc_bayesian_pinn_ode(prob::ODEProblem{Float64, Tuple{Float64, Float64}, false, Vector{Float64}, ODEFunction{false, SciMLBase.AutoSpecialize, typeof(wind_turbine), LinearAlgebra.UniformScaling{Bool}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing, Nothing, Nothing, Nothing}, @Kwargs{}, SciMLBase.StandardODEProblem}, chain::Chain{@NamedTuple{layer_1::Dense{true, typeof(sigmoid_fast), typeof(glorot_uniform), typeof(zeros32)}, layer_2::Dense{true, typeof(sigmoid_fast), typeof(glorot_uniform), typeof(zeros32)}, layer_3::Dense{true, typeof(sigmoid_fast), typeof(glorot_uniform), typeof(zeros32)}, layer_4::Dense{true, typeof(identity), typeof(glorot_uniform), typeof(zeros32)}}, Nothing}; strategy::Type{GridTraining}, dataset::Vector{Vector{Float64}}, init_params::Nothing, draw_samples::Int64, physdt::Float64, l2std::Vector{Float64}, phystd::Vector{Float64}, priorsNNw::Tuple{Float64, Float64}, param::Vector{Normal{Float64}}, nchains::Int64, autodiff::Bool, Kernel::Type, Adaptorkwargs::@NamedTuple{Adaptor::UnionAll, Metric::UnionAll, targetacceptancerate::Float64}, Integratorkwargs::@NamedTuple{Integrator::UnionAll}, MCMCkwargs::@NamedTuple{n_leapfrog::Int64}, progress::Bool, verbose::Bool)
│        @ NeuralPDE C:\Users\FX03NI\.julia\packages\NeuralPDE\Xp1OF\src\advancedHMC_MCMC.jl:510
│     [11] __solve(::ODEProblem{Float64, Tuple{Float64, Float64}, false, Vector{Float64}, ODEFunction{false, SciMLBase.AutoSpecialize, typeof(wind_turbine), LinearAlgebra.UniformScaling{Bool}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing, Nothing, Nothing, Nothing}, @Kwargs{}, SciMLBase.StandardODEProblem}, ::BNNODE{Chain{@NamedTuple{layer_1::Dense{true, typeof(sigmoid_fast), typeof(glorot_uniform), typeof(zeros32)}, layer_2::Dense{true, typeof(sigmoid_fast), typeof(glorot_uniform), typeof(zeros32)}, layer_3::Dense{true, typeof(sigmoid_fast), typeof(glorot_uniform), typeof(zeros32)}, layer_4::Dense{true, typeof(identity), typeof(glorot_uniform), typeof(zeros32)}}, Nothing}, UnionAll, @NamedTuple{Integrator::UnionAll}, @NamedTuple{Adaptor::UnionAll, Metric::UnionAll, targetacceptancerate::Float64}, @NamedTuple{n_leapfrog::Int64}, Nothing, Nothing, Vector{Normal{Float64}}, Vector{Vector{Float64}}}; dt::Nothing, timeseries_errors::Bool, save_everystep::Bool, adaptive::Bool, abstol::Float32, reltol::Float32, verbose::Bool, saveat::Float64, maxiters::Nothing, numensemble::Int64)
│        @ NeuralPDE C:\Users\FX03NI\.julia\packages\NeuralPDE\Xp1OF\src\BPINN_ode.jl:197
│     [12] __solve
│        @ C:\Users\FX03NI\.julia\packages\NeuralPDE\Xp1OF\src\BPINN_ode.jl:171 [inlined]
│     [13] #solve_call#44
│        @ C:\Users\FX03NI\.julia\packages\DiffEqBase\O8cUq\src\solve.jl:612 [inlined]
│     [14] solve_call
│        @ C:\Users\FX03NI\.julia\packages\DiffEqBase\O8cUq\src\solve.jl:569 [inlined]
│     [15] solve_up(prob::ODEProblem{Float64, Tuple{Float64, Float64}, false, Vector{Float64}, ODEFunction{false, SciMLBase.AutoSpecialize, typeof(wind_turbine), LinearAlgebra.UniformScaling{Bool}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing, Nothing, Nothing, Nothing}, @Kwargs{}, SciMLBase.StandardODEProblem}, sensealg::Nothing, u0::Float64, p::Vector{Float64}, args::BNNODE{Chain{@NamedTuple{layer_1::Dense{true, typeof(sigmoid_fast), typeof(glorot_uniform), typeof(zeros32)}, layer_2::Dense{true, typeof(sigmoid_fast), typeof(glorot_uniform), typeof(zeros32)}, layer_3::Dense{true, typeof(sigmoid_fast), typeof(glorot_uniform), typeof(zeros32)}, layer_4::Dense{true, typeof(identity), typeof(glorot_uniform), typeof(zeros32)}}, Nothing}, UnionAll, @NamedTuple{Integrator::UnionAll}, @NamedTuple{Adaptor::UnionAll, Metric::UnionAll, targetacceptancerate::Float64}, @NamedTuple{n_leapfrog::Int64}, Nothing, Nothing, Vector{Normal{Float64}}, Vector{Vector{Float64}}}; kwargs::@Kwargs{saveat::Float64})
│        @ DiffEqBase C:\Users\FX03NI\.julia\packages\DiffEqBase\O8cUq\src\solve.jl:1080
│     [16] solve_up
│        @ C:\Users\FX03NI\.julia\packages\DiffEqBase\O8cUq\src\solve.jl:1066 [inlined]
│     [17] #solve#51
│        @ C:\Users\FX03NI\.julia\packages\DiffEqBase\O8cUq\src\solve.jl:1003 [inlined]
│     [18] top-level scope
│        @ In[43]:2
│     [19] eval
│        @ .\boot.jl:385 [inlined]
│     [20] include_string(mapexpr::typeof(REPL.softscope), mod::Module, code::String, filename::String)
│        @ Base .\loading.jl:2076
│     [21] softscope_include_string(m::Module, code::String, filename::String)
│        @ SoftGlobalScope C:\Users\FX03NI\.julia\packages\SoftGlobalScope\u4UzH\src\SoftGlobalScope.jl:65
│     [22] execute_request(socket::ZMQ.Socket, msg::IJulia.Msg)
│        @ IJulia C:\Users\FX03NI\.julia\packages\IJulia\Vo51o\src\execute_request.jl:67
│     [23] #invokelatest#2
│        @ .\essentials.jl:892 [inlined]
│     [24] invokelatest
│        @ .\essentials.jl:889 [inlined]
│     [25] eventloop(socket::ZMQ.Socket)
│        @ IJulia C:\Users\FX03NI\.julia\packages\IJulia\Vo51o\src\eventloop.jl:8
│     [26] (::IJulia.var"#15#18")()
│        @ IJulia C:\Users\FX03NI\.julia\packages\IJulia\Vo51o\src\eventloop.jl:38
└ @ NeuralPDE C:\Users\FX03NI\.julia\packages\NeuralPDE\Xp1OF\src\advancedHMC_MCMC.jl:510
┌ Info: Current Prior Log-likelihood : 
└   priorweights(ℓπ, initial_θ) = -15096.924089369746
┌ Info: Current MSE against dataset Log-likelihood : 
└   L2LossData(ℓπ, initial_θ) = -95277.57901401068

Data (wr, Tg):

wr_data = [0.8212261168549496, 0.8380977341328735, 0.8236625481467496, 0.8240543672129728, 0.8250352034986291, 0.8268818386686797, 0.8334576553288775, 0.8270562179743942, 0.8355394986087811, 0.8188712425433945, 0.8269304367148607, 0.8190543403201025, 0.8174300869792841, 0.8003322855517404, 0.8135022536985811, 0.8065847131606709, 0.8112185975632649, 0.8156818046766139, 0.8132578349710595, 0.8314459052761878, 0.794619595178375, 0.8062807682334083, 0.8057843260725241, 0.8002969268917778, 0.804721806552491, 0.8233449992406878, 0.8119219953970644, 0.7982522038743983, 0.7980924315935516, 0.7996586299951158, 0.8001207024703654, 0.8032568321135515, 0.8140244677283766, 0.8201336087801956, 0.8143930188784186, 0.806372407431599, 0.804901514930225, 0.7997785669597273, 0.8097552087247892, 0.798175322084142, 0.8109082387114198, 0.7949096135751348, 0.812078793011493, 0.8021292491081808, 0.8037363736008888, 0.7845203051241075, 0.7896242710291562, 0.7894358900027427, 0.8043566677999464, 0.8021426078700665, 0.7725990877487255, 0.7922650094209008, 0.8025213724086037, 0.7996968654745262, 0.8006933337507858, 0.7966893906669854, 0.7907537729837157, 0.7926891743739529, 0.7981813308964615, 0.7922238994876885, 0.7790697689523524, 0.7878965368066566, 0.7713881042081223, 0.8034136500372623, 0.7940974165245598, 0.7912594186399601, 0.78518609702941, 0.7841722594771796, 0.788367151052553, 0.778774641639401, 0.7801218462222664, 0.7813086215765933, 0.7843302557613947, 0.7884683314739772, 0.7901483581773442, 0.7846907516026483, 0.8016343254432033, 0.7789045373015712, 0.7772901667504889, 0.7798917949111743, 0.76907833977886, 0.7769926904055524, 0.7856886087798507, 0.7788760305930711, 0.7802739967039826, 0.78168932751066, 0.7774687693845117, 0.7924289257659901, 0.776699338932644, 0.7838511560139801, 0.7779181963259246, 0.7794269297364596, 0.7968796084128638, 0.7759786058158862, 0.790953757443552, 0.7806513189939249, 0.7886692945213784, 0.7697498131715125, 0.7862018938154879, 0.7835588937389127, 0.7771868928829573, 0.7789939382055796, 0.7651837612702929, 0.7725566618643377, 0.776316120382919, 0.7740778529770221, 0.7800040348796765, 0.8034882019139905, 0.7880633443468137, 0.7777001706367517, 0.7740170279141095, 0.7812821892894956, 0.7844067263178472, 0.7785647569974724, 0.7721634308571965, 0.7657192342064852, 0.7722153008856839, 0.7726279911074234, 0.7726632799763046, 0.7776501468877346, 0.7843457526708039, 0.7804415600347056, 0.7690994275929749, 0.7830230447713021, 0.7633388346163263, 0.7699035002152798, 0.7798435771882257, 0.7697125844631906, 0.7711587533941067, 0.7680387850073052, 0.7698366601661567, 0.7674166775957147, 0.7732209870611717, 0.7733187424023353, 0.7797871343324576, 0.7583850975774784, 0.7919769768561262, 0.7673479211420763, 0.7816802181781636, 0.7767155465164487, 0.7831395564438646, 0.7856139176835993, 0.7849906830856784, 0.7853736319551634, 0.778891500538459, 0.7819203074225752, 0.7768429623997756, 0.7675718165089377, 0.7814342039952533, 0.7803133480508896, 0.7697763794792286, 0.789330721518402, 0.7857582283032658, 0.7888511696249569, 0.7861521641642654, 0.7910367877887662, 0.785823589330202, 0.7917168187234301, 0.780130499447352, 0.7847000840122811, 0.7872390062379271, 0.785618804345663, 0.7844022748960121, 0.7713751982740571, 0.7776985687890869, 0.7843898452095515, 0.7833862163253508, 0.7934558461406959, 0.7734843126776623, 0.7942011989603114, 0.7916690892314512, 0.7816500535128984, 0.783245037555283, 0.7765187257278121, 0.7802740923648536, 0.7918099406440016, 0.7884684181657559, 0.7884297589825708, 
0.8005676152637543, 0.7969112551714815, 0.7794290294703805, 0.7735250364402747, 0.789504105843174, 0.7852366717012799, 0.7837786663793241, 0.7789859520680555, 0.7976923549132241, 0.780841067513197, 0.8012176866293751, 0.789146595447003, 0.7993679454620221, 0.7793183149298468, 0.7921865246780697, 0.7993548341745274, 0.7965230839430553, 0.802891259553005, 0.7832241248254244, 0.7849734571359088, 0.7993532803802806, 0.8006801820000053, 0.7936169298713532, 0.7975727012601131, 0.783954199257282, 0.7796470917937328, 0.789039920544749, 0.7813786907066501, 0.7977916591693163, 0.8027717029640908, 0.7881630028445457, 0.7886453655310879, 0.7901080288524488, 0.8025863209032646, 0.7972563598849872, 0.8127209982107063, 0.7916954225535682, 0.801845241665248, 0.7961617051810459, 0.8096790269872679, 0.7945763873167621, 0.8018125374658646, 0.8087938548136966, 0.7989412380127845, 0.7996237661771235, 0.7919397235617718, 0.8021987225568653, 0.7882556798038058, 0.7851777961295042, 0.7857234414708121, 0.8076497438240173, 0.7843382624537271, 0.7845421082998032, 0.7982888630181494, 0.8017110257075418, 0.7988444015466231, 0.8011175640146057, 0.7995884104352207, 0.7945657766626326, 0.8019434216954056, 0.8100347949374257, 0.8015751056272505, 0.7860634919937205, 0.7920175633267168, 0.7957550213004233, 0.7927640570247003, 0.8014491652112872, 0.79875744983816, 0.7888173949379718, 0.7937255510888527, 0.7832175102707963, 0.7980069396806891, 0.7951529335129354, 0.7941681105364995, 0.7934301389771419, 0.8000937127668816, 0.7873273592484866, 0.7776067426874117, 0.7778080425911346, 0.7936341144804387, 0.7962144750053256, 0.8010614231821008, 0.7968868762084804, 0.8061936258070648, 0.7876332276078191, 0.7918725981963243, 0.7810209327115233, 0.7898059919807616, 0.7831432068221081, 0.7814428886161361, 0.7842499536427889, 0.7939830957909396, 0.7863047384816725, 0.7772994949021558, 0.7807492593799895, 0.785305395342494, 0.7739501228257282, 0.7811393740832063, 0.7746733533225817, 0.7864500379192657, 0.8017873061316816, 0.7806488775768515, 0.7961133674134184, 0.7767493523594231, 0.7926979793052669, 0.7882842986058368, 0.7872176343210983, 0.7752822856324962, 0.7935236951324606, 0.7858585319648455, 0.7714995232050164, 0.7824143400198202, 0.7877465156424529, 0.7888810974290015, 0.7809604217690053, 0.7772629841553806, 0.7711355342027133, 0.7820434939815948, 0.7871029432262984, 0.7890460549686102, 0.7758616611925552, 0.7806864247701123, 0.7833958338449155, 0.7889698514064657, 0.7794632504015235, 0.7959751291022152, 0.7829006384172619, 0.7782655403780037, 0.7756992216044837, 0.7668242001323821, 0.7724994560336605, 0.776545126422134, 0.7850438492450494, 0.787270248327528, 0.7664320171984546, 0.7865089780114228, 0.772029569342831, 0.792421643779087, 0.7831002301096011, 0.7937172311830747, 0.7818629730483274, 0.7947117131657337, 0.7758177571370323, 0.7912271572347191, 0.7890271052873308, 0.7706206663103867, 0.7811809333255754, 0.7750334112379695, 0.7822338220436571, 0.7747002026783618, 0.7827609598289667, 0.7894897562045347, 0.7785943302421247, 0.7767684920235752, 0.7700782529343159, 0.7831096012503198, 0.7876931198778249, 0.7897577751355874, 0.785838886485919, 0.7802950057107355, 0.7821514359925009, 0.7817590468795099, 0.7894489076557668, 0.7798598739336634, 0.7915377470344717, 0.7770487805706928, 0.7809240333836643, 0.7898232216932, 0.7743697422534254, 0.7739115182533607, 0.7877504936451829, 0.7775011022398114, 0.8000101854040332, 0.7947937684429537, 0.7772384178706452, 0.7768206303521747, 0.7876043016704498, 0.7820066458085178, 
0.7740531009163394, 0.7881273174006377, 0.7756965425769458, 0.7846065561205966, 0.776851921682045, 0.7858486340804405, 0.7786635637170581, 0.774078198846006, 0.7959698098670093, 0.7898976275378764, 0.7891230615462801, 0.7717958537618167, 0.7861847630565055, 0.7918479519206516, 0.7857844777398458, 0.7874236739402226, 0.7794217999049615, 0.785786439368261, 0.7711201782133048, 0.8038408919163633, 0.7848140587487822, 0.7794135234062893, 0.7841431878134043, 0.7946092304902402, 0.7708498153564849, 0.7927128536019257, 0.7981208892572043, 0.7850679119656219, 0.784274103419344, 0.792814668230845, 0.7920967669417367, 0.7946205401631731, 0.7910548539924553, 0.7880746194367133, 0.7992565201813883, 0.7928836290127471, 0.7835407278278548, 0.7895357935522571, 0.7937884318558855, 0.7830204311700022, 0.7932823086454233, 0.801096352606319, 0.8029402678316392, 0.7993844464579739, 0.7854236632172993, 0.7868181395626043, 0.7932043036460458, 0.7939760512049477, 0.7899451834731457, 0.8091962110189206, 0.7849129828683876, 0.7946883612633682, 0.7762098792708465, 0.8038056117184813, 0.8022403047406029, 0.7988428969801679, 0.7907645228896664, 0.7954449137806645, 0.7858781139148188, 0.7761077884619777, 0.8094899779825371, 0.7977980396255183, 0.7821865264009462, 0.8095951368779737, 0.7980339639612448, 0.7992234884559516, 0.8141165798030265, 0.8066708142812883, 0.7973708402400108, 0.7992957644319747, 0.8076173379441656, 0.8020142717192973, 0.8093548930921148, 0.8006795185097654, 0.8057938596566434, 0.7922966084486927, 0.7987019000461916, 0.8216101922210423, 0.7999710008048966, 0.8040043410600988, 0.8265237600834399, 0.7947772918879276, 0.8116201190792588, 0.8226889595478034, 0.8333611085944224, 0.8082603722905951, 0.8104252201196457, 0.8181000985456333, 0.8161821760694243, 0.8107275586856304, 0.8147167059604583, 0.8097472299985498, 0.797811403364491, 0.8202705316071879, 0.807710392659657, 0.8082941794513145, 0.7884969848979001, 0.8191276482907951, 0.8093542446103117, 0.8149919519472112, 0.8179845960590746, 0.8208446732411492, 0.8115081935449496, 0.7979558227347482, 0.8092658745126404, 0.8031004160044205, 0.8246414473059412, 0.8081651824692964, 0.8073391196468975, 0.8065273944037231, 0.8239793065238143, 0.827136258729476, 0.8077452226302112, 0.8089480629131387, 0.8116363079119404, 0.8108443833228951, 0.8238285528186068, 0.8257128585333997, 0.8212830829160043, 0.8169382249833211, 0.8176813574555081, 0.7983995367793212, 0.7976818593284059, 0.8083346272236271, 0.8177200204571562, 0.8124077396760356, 0.8258631333446184, 0.8087533567654556, 0.8144938067800762, 0.8166704550887778, 0.8143804355958731, 0.8026170334014237, 0.8082148930011509, 0.8133767343053023, 0.8175418702063141, 0.8281908039900492, 0.8145696476202191, 0.8124402492309618, 0.8175645398256356, 0.8168904906790917, 0.8164828216320171, 0.8166775764088988, 0.8154221953663385, 0.7967911599905775, 0.81050900422315]
Tg_data = [1.9476740965225182e7, 1.9439862899300076e7, 1.9767786315952912e7, 1.975059589304701e7, 1.9376976491131086e7, 1.9450922169227198e7, 1.989377208204937e7, 1.9331738843609717e7, 1.987758216948938e7, 1.9867848583592914e7, 1.9601348121578846e7, 1.937208535329919e7, 1.9472260646177955e7, 1.9785523530808352e7, 1.9625470657827657e7, 1.983106109315583e7, 1.9711722696994692e7, 2.001649160466327e7, 1.9415202262903135e7, 1.9657542177637335e7, 1.9786500185718942e7, 1.97526290639845e7, 1.995504194341402e7, 1.988852095939581e7, 1.957394967151833e7, 1.9452333627264682e7, 2.0026857914656606e7, 1.931722738285469e7, 1.9772866980637804e7, 1.9883325821478635e7, 1.9646728733317275e7, 1.9764505298615087e7, 1.9957778739564143e7, 1.983482678600571e7, 2.026657170427341e7, 1.9915249441471823e7, 1.9698784225305323e7, 2.026815944796991e7, 1.9830850878200468e7, 2.0117527533674613e7, 1.9958438240168165e7, 2.006696287533024e7, 2.005026455948653e7, 1.991977196562781e7, 2.0117196422172442e7, 2.0374993832361434e7, 2.0498778230399765e7, 2.0117851772465117e7, 2.0089236536325403e7, 2.015494461517833e7, 2.0466220576594554e7, 2.0508686600456286e7, 1.9967686887733325e7, 2.006914555671472e7, 2.0412107292799003e7, 2.0165652421052627e7, 2.039287908677167e7, 2.0598232936660957e7, 2.0162052114687387e7, 2.041194288124285e7, 2.0971728232187007e7, 2.012292668357036e7, 2.0350248990024928e7, 2.0539879517869197e7, 2.015526585498712e7, 2.0186346471356954e7, 2.0115213281100452e7, 2.0870509298743665e7, 2.037093285609589e7, 2.0198835839263402e7, 2.056707302256651e7, 2.0396699678400956e7, 2.079974124100114e7, 2.0479000791111056e7, 2.0486656732380137e7, 2.0754375935072128e7, 2.0454078514845543e7, 2.01266083903905e7, 2.071887122804562e7, 2.0781211593311105e7, 2.0590304071082342e7, 2.034450293725645e7, 2.050888338509164e7, 2.0371536952884186e7, 2.0713622700007103e7, 2.0663032750506308e7, 2.0803164328308027e7, 2.0738413735862862e7, 2.093763528536701e7, 2.084959084220987e7, 2.077510021568875e7, 2.0708726403784163e7, 2.0696471144811146e7, 2.079212185094665e7, 2.085691947548279e7, 2.025100002349638e7, 2.1197024454755098e7, 2.0720765118479535e7, 2.0944537797471624e7, 2.1134934884375498e7, 2.0415410747940086e7, 2.1014151888419498e7, 2.1080447551497117e7, 2.063135834833959e7, 2.0842367672840465e7, 2.0772693157811686e7, 2.0629496500538945e7, 2.0686620448102716e7, 2.0619308804432176e7, 2.0822534375110753e7, 2.1199772073144574e7, 2.082589439488125e7, 2.1158537593904592e7, 2.0729515243150763e7, 2.0854383637362126e7, 2.0741899955086417e7, 2.1176256092603095e7, 2.0769816585070707e7, 2.1041153987373184e7, 2.0441675957674935e7, 2.099531999698766e7, 2.1267910346388336e7, 2.071018109026796e7, 2.073217100340439e7, 2.0708424010493852e7, 2.114698845172296e7, 2.0930545062303744e7, 2.076401379912978e7, 2.0839746862816736e7, 2.0962770338717613e7, 2.093772772015214e7, 2.0726548619154435e7, 2.0862620947064415e7, 2.0745798610365286e7, 2.0986337328226063e7, 2.124074809609949e7, 2.0988888442464106e7, 2.0441589028404605e7, 2.097243981100409e7, 2.094407362755446e7, 2.0991781538974486e7, 2.0808537012878496e7, 2.0771758697865028e7, 2.1034556741474856e7, 2.0600195483524278e7, 2.0781137942710333e7, 2.0519294249926616e7, 2.0951760114855148e7, 2.0722443088103384e7, 2.051947573653138e7, 2.0934542593421966e7, 2.0838715691214476e7, 2.1011679018534154e7, 2.077052695547511e7, 2.073216492533055e7, 2.1015919973436132e7, 2.0836108929908723e7, 2.099027662296014e7, 2.053451840788793e7, 2.099882684474671e7, 2.0558827475870237e7, 2.0958546592747636e7, 2.082111680286653e7, 
2.0897698464960475e7, 2.052367728768505e7, 2.0427047178934935e7, 2.0549169407519486e7, 2.042015121654437e7, 2.096281746632131e7, 2.0631614182581358e7, 2.0768607267127898e7, 2.067254835350426e7, 2.0402163703746412e7, 2.027137190352629e7, 2.063282420096186e7, 2.0899341673022103e7, 2.072238264312694e7, 2.0399278587120615e7, 2.02323106026369e7, 2.0210342334627323e7, 2.046718521814348e7, 2.0048465657704856e7, 2.066862250382113e7, 2.0722989043942187e7, 2.0414549716592524e7, 2.0466368615911447e7, 2.0326296907687563e7, 2.0567358445585027e7, 2.041939093027427e7, 2.0402805901428983e7, 2.062952038090013e7, 2.0465910538747296e7, 1.9907499874388296e7, 2.051170855534357e7, 2.054896891500289e7, 2.041379680296136e7, 2.0529110650911104e7, 2.0181606368034236e7, 2.011155726984081e7, 2.059757415277848e7, 2.0414466940980464e7, 2.0696287669463843e7, 2.0164561261062518e7, 2.0053768285014436e7, 2.041131696636773e7, 2.0556410629043307e7, 2.0110689521335978e7, 2.0310479244134326e7, 2.017451670120823e7, 2.051983982547759e7, 2.0308880503157463e7, 2.0459185380369987e7, 2.0090439190019075e7, 2.0125928436288666e7, 2.0192161570156835e7, 2.0261254911742445e7, 2.0649189012699887e7, 2.063038169076732e7, 2.006505217163511e7, 2.034920283093054e7, 2.011542129356516e7, 1.984040016920075e7, 2.0295070875958644e7, 1.984336220087781e7, 2.0375081405358188e7, 2.0271250796620075e7, 2.0420978973810293e7, 1.9615421653585445e7, 2.015375142551677e7, 1.9937784440996982e7, 2.00660649201496e7, 1.9881292567587882e7, 1.9930124252114892e7, 2.04797808685674e7, 2.026439737735257e7, 2.0581775416923817e7, 2.0265670789176445e7, 2.0181356208057787e7, 1.9997119725598678e7, 1.9974315282378137e7, 2.0192534902743958e7, 2.022995183642547e7, 2.0192429228574064e7, 2.009299152972648e7, 2.0028855276250895e7, 2.0017811189596113e7, 2.0425509956023168e7, 2.0137949099442545e7, 2.0476469797397953e7, 2.036941323956405e7, 2.0647129216869626e7, 2.010511219886268e7, 2.0254388032062165e7, 1.9937252597245835e7, 1.999170085091814e7, 2.059889523660745e7, 2.0383530189506743e7, 2.017272603752279e7, 2.041842584088956e7, 2.007831360077167e7, 2.051815453261512e7, 2.057147572427896e7, 2.0197696826380394e7, 2.0279188860609498e7, 2.0440884281693988e7, 2.067098979775607e7, 2.0305025849608343e7, 2.0700203421530638e7, 2.0608039114919826e7, 2.0421019898103997e7, 2.0553706029788073e7, 2.0602357165677875e7, 2.0260215238951735e7, 2.0368158304756355e7, 2.078277674536505e7, 2.01093816083355e7, 2.0521740673094567e7, 2.0803720281277206e7, 2.0311699832528353e7, 2.0421928750328306e7, 2.0817524785461627e7, 2.0745109769070655e7, 2.066267390266542e7, 2.0486024526037067e7, 2.0585461180104565e7, 2.0405637803805076e7, 2.071899581642066e7, 2.0873398098104395e7, 2.0530290556044135e7, 2.05551661704981e7, 2.057704296407216e7, 2.0727837734371807e7, 2.0742037043746065e7, 2.0513014372272037e7, 2.0942434480669133e7, 2.088146645390627e7, 2.0500348724813603e7, 2.0699802172488883e7, 2.0969175556419972e7, 2.056457499496664e7, 2.0565403037167605e7, 2.117166806165849e7, 2.092095400687608e7, 2.0614388317713693e7, 2.088699744047668e7, 2.072620383423154e7, 2.096186346570324e7, 2.1077811438995466e7, 2.046891679475831e7, 2.0848414185104363e7, 2.083148724387039e7, 2.06811136381933e7, 2.060016080998289e7, 2.0459170499739453e7, 2.048080956602571e7, 2.070275982247059e7, 2.0719923472853247e7, 2.0892964341350142e7, 2.0886393544557214e7, 2.0626558102833383e7, 2.046827313290005e7, 2.069416999857598e7, 2.0803023811814215e7, 2.0592977043664083e7, 2.0657208363460913e7, 2.04773667942856e7, 2.0550606311585985e7, 
2.084501478322706e7, 2.0195519108712744e7, 2.0449322865094844e7, 2.078438094876101e7, 2.0918463963425472e7, 2.074530458678474e7, 2.0883366192726094e7, 2.0605031501134098e7, 2.0866142834534787e7, 2.0700937498860054e7, 2.055740697237451e7, 2.0465930088204302e7, 2.07505694276642e7, 2.073488189187099e7, 2.031130658299552e7, 2.0865181981018834e7, 2.056507798740979e7, 2.050661026918785e7, 2.0739135863436647e7, 2.067107665515634e7, 2.057046617395571e7, 2.0630890768481933e7, 2.0441463024777398e7, 2.0414251458346535e7, 2.0463860099329602e7, 2.0362753642369572e7, 2.072939653683322e7, 2.0549319858201247e7, 2.050648310094485e7, 2.0636963465638563e7, 2.0625952799675472e7, 2.037895247658202e7, 2.0752527211122524e7, 2.030820050673989e7, 2.0412447575120877e7, 2.0641554518151097e7, 2.0790013369720582e7, 2.0836572412837755e7, 2.055358161186589e7, 2.047055792057408e7, 2.0476293584887147e7, 2.07037562159913e7, 2.076303266674312e7, 2.0440056181926053e7, 2.0567249242137354e7, 2.0829385892866064e7, 2.061751848284327e7, 2.0832640766573958e7, 2.0661750559224796e7, 2.0052320609225146e7, 2.0473532255711265e7, 2.007648694339135e7, 2.0360946270626396e7, 2.0825367821756683e7, 2.076954524565832e7, 2.030092095214793e7, 2.064504104732515e7, 2.03743932128431e7, 2.066472405722942e7, 2.0258309782981224e7, 2.0120758810812686e7, 2.0432578296263892e7, 2.054135519409161e7, 1.998148267945137e7, 1.9920374645475306e7, 2.042108965735869e7, 2.0662228374913685e7, 2.031579540911978e7, 2.0339253066675756e7, 2.007203807529851e7, 2.0686885334127586e7, 2.0395988072274484e7, 2.0203453871233966e7, 2.036784736063346e7, 2.01637239492126e7, 2.0488114011267908e7, 2.049369289381583e7, 2.0413740860148396e7, 2.0278122449650764e7, 2.0381690104105953e7, 2.0152088978750627e7, 2.0359085836663634e7, 2.0437231718059104e7, 2.031688451851921e7, 2.027855905592064e7, 2.0478331655491587e7, 2.0425224255466197e7, 2.0211008384890873e7, 2.0159332733417958e7, 2.058492906972135e7, 2.0036332505646344e7, 2.027594190998425e7, 2.026394583824963e7, 2.040988369450841e7, 2.014200533507233e7, 2.064031383295861e7, 2.017626787152113e7, 2.040201326541477e7, 2.0182784451017965e7, 2.032097165759437e7, 1.99352169239525e7, 2.0189613642820947e7, 1.982451139359677e7, 1.9902282171876602e7, 2.0091219380972266e7, 1.9714072456671618e7, 2.014826697772682e7, 1.9658681821439035e7, 1.9950402361063488e7, 1.9836979889183998e7, 1.98708485997446e7, 2.006607607780539e7, 1.9489087757220596e7, 2.0215554906958353e7, 1.998081897652744e7, 1.9900115506895505e7, 2.011858219916424e7, 2.012206156467354e7, 2.0176977778062917e7, 2.0175525997146577e7, 2.0101898618644163e7, 1.9931031145034336e7, 1.9955172255570207e7, 1.9780151503113125e7, 1.9587935565365985e7, 1.9717469326590355e7, 1.968336962257632e7, 1.984538786557086e7, 1.9772476414426096e7, 1.9716980254671138e7, 1.952297345987598e7, 1.9819605856147442e7, 1.96248135112413e7, 1.984491124984302e7, 1.9901341929251425e7, 1.9697743936137367e7, 1.9807797341474432e7, 1.9832134988606174e7, 1.9992832613466945e7, 1.9629006090373997e7, 1.9999709437824417e7, 1.9801785267047644e7, 1.992237144247372e7, 1.9899708870062344e7, 1.9668779407340374e7, 1.968557450776084e7, 1.9693501870821893e7, 2.00304304862785e7, 1.9939769632189654e7, 1.955119077445802e7, 2.006913022438515e7, 1.9876297533516604e7, 1.9709002786590707e7, 1.9450415895709295e7, 1.9971652812327668e7, 1.9706790617846757e7, 2.028580808043706e7, 1.9960026194783144e7, 2.0059985815954294e7, 1.986815806268556e7, 1.963469694879995e7, 1.977539447851393e7, 1.9919034383156303e7, 1.984620676543988e7, 1.952722096112599e7, 
1.953764975520838e7, 2.001679025419081e7, 1.988800950936638e7, 2.000216391447059e7, 1.9506124555344705e7, 1.997283857283738e7, 1.9854912331901286e7, 1.9832713814487122e7, 1.9761856189112842e7]

Data (θ, vr):

θ_data = [0.17347036342425234, 0.17278705971111966, 0.17426116295271185, 0.17382033945758513, 0.1761054555012935, 0.17351292585669117, 0.17392697346188762, 0.1721813695184033, 0.1757461570663251, 0.1752926485105466, 0.1761839602342967, 0.17395868760511335, 0.17273107041015978, 0.17457132706838346, 0.17218867281437397, 0.17414621423564802, 0.17425042628813608, 0.16836021816916816, 0.17533305819765038, 0.1744230604523429, 0.17118485940430053, 0.17405493835822322, 0.16866530169665958, 0.17126856613591548, 0.1710613184141791, 0.17362596174561515, 0.16865588200815446, 0.16971338337419128, 0.16897760526531586, 0.1686125793198901, 0.17127461049337458, 0.168569439558, 0.16994540909859537, 0.16814082597227045, 0.1693142620481324, 0.16802815607461877, 0.16730297623086773, 0.1686531361738511, 0.16637573292986624, 0.16589701436788468, 0.16782597539397753, 0.16550534080997936, 0.16665654900533053, 0.1647778198589098, 0.164731758450334, 0.1698833268027402, 0.16502690481887347, 0.16479469843434685, 0.16306628577103158, 0.1626741543429096, 0.166204130176553, 0.16380437292749894, 0.16440543376074318, 0.16229090685081987, 0.16342743172213525, 0.16283148304589545, 0.16429833707642222, 0.1617340941036235, 0.1635819542522318, 0.16126731612675602, 0.16105578973628198, 0.16190159657981346, 0.16086232163333997, 0.16542401107015786, 0.15995687132219952, 0.15954692241355126, 0.15938531924268923, 0.16133983786668765, 0.15773697409764878, 0.16143801961596566, 0.1579402491237887, 0.16142044241001977, 0.15986710159174716, 0.16068067077053572, 0.15918364446139763, 0.15693547458347484, 0.15750445953498762, 0.15868537164477925, 0.1551207272955045, 0.15782958163511238, 0.1561419895862287, 0.15688273167635605, 0.158315029733553, 0.15546438656406766, 0.156420630994807, 0.15795893702717517, 0.15523032506908116, 0.15769416515751256, 0.15743561599587802, 0.15435275681750998, 0.15468414777432843, 0.15468525872504665, 0.15311986311380366, 0.15235773150226142, 0.15513393075910198, 0.15261728514578324, 0.15382499563129065, 0.15261491107895567, 0.15697897376001835, 0.1581641123736977, 0.15258473193280148, 0.1540189942822863, 0.15232089404775015, 0.1543252648807093, 0.1542999865425203, 0.14944258573193603, 0.15233846583677946, 0.15253233589285, 0.15039785540736159, 0.15200044391593973, 0.15166191462392606, 0.1539111339919344, 0.15135127333877432, 0.15365302123975066, 0.14982617222381728, 0.15144422163473442, 0.14922564000038652, 0.15085370318856467, 0.15061917758962193, 0.15223707285386592, 0.15072640753421176, 0.14799326931718204, 0.15040877175953118, 0.15057803912002304, 0.1472890128499841, 0.153189860898925, 0.15047915473808118, 0.14811550527771347, 0.15012989217509165, 0.1521808096013161, 0.15051288429387288, 0.14729502139761766, 0.15169086362279274, 0.14693717314525448, 0.1487392114451872, 0.15048175539886063, 0.14890083805753174, 0.14965191527478983, 0.1493920832675354, 0.15022770590743065, 0.1485291530505856, 0.1504182269348514, 0.14672491350641806, 0.15142972455359827, 0.14851080511964063, 0.1468566718885798, 0.15019923238317145, 0.14837358842791856, 0.14718493122776904, 0.1475578338437295, 0.1486414922930866, 0.14907507475705567, 0.14808816857862414, 0.1491298851026224, 0.15024054989585248, 0.14764534122229866, 0.14846628421566363, 0.15057536501134208, 0.14794457698511035, 0.1482597317142855, 0.14815245467640267, 0.14813350227040262, 0.1477948170134642, 0.14867773184383756, 0.14878795782512957, 0.14725200627095092, 0.15114350283477732, 0.15082565781548532, 0.15215959507178295, 0.14751400675620663, 0.1508482483424069, 
0.14950114492338326, 0.14691914651001256, 0.149636619726135, 0.1509970152834429, 0.1481916289553313, 0.14830985282078435, 0.15157599470469246, 0.14986199859212015, 0.15274448073497537, 0.15034214207326171, 0.15029554567186143, 0.14967594900965883, 0.14964761323181797, 0.1486907130849705, 0.14982468743890073, 0.15121721621710416, 0.1480471101741656, 0.15338463228533572, 0.15230927121930724, 0.1531116748782297, 0.15344849340789302, 0.15118513405464082, 0.1538313384970439, 0.1514368260012861, 0.15236761132351834, 0.15114141399382006, 0.15296827364775292, 0.15081205157950092, 0.15036347097712127, 0.15208477993677724, 0.15262370194451788, 0.15183942593432362, 0.15279236276764124, 0.15279594086390189, 0.15475762547633948, 0.15116797250778616, 0.15372612723853143, 0.15211150169436258, 0.1559791393609116, 0.15370222053719, 0.1517705100173356, 0.15414975739894032, 0.1487761283081923, 0.15218631388896148, 0.15386681490513512, 0.1533220565976505, 0.15209460183662868, 0.15412609982363465, 0.15277265449865882, 0.15409719786489437, 0.15130730395844766, 0.15313059228284448, 0.1557828841692779, 0.15212232666103906, 0.15365712996834305, 0.15455027725404255, 0.15602482243027443, 0.1550348235344537, 0.15411455206830535, 0.1546760146100976, 0.155984793824255, 0.15668083446818776, 0.15608084340305553, 0.15641978998577882, 0.15737863348766817, 0.15825136547054097, 0.1552545617916057, 0.15546276752093968, 0.15825047121279553, 0.15560424883443075, 0.15641314404812978, 0.15722774686188856, 0.15713729030049778, 0.1578023327306516, 0.15479359999490125, 0.1579119440142812, 0.1553288011288763, 0.15752373485054555, 0.15721814047889757, 0.1549296371791372, 0.15695783872392416, 0.15554871272232518, 0.15633118358637954, 0.15529901132332066, 0.15456549749297638, 0.1573013320654305, 0.15633213076128838, 0.15784399421363093, 0.155450237089754, 0.151880715839393, 0.15614967399470855, 0.1554546577188444, 0.15794349135769176, 0.15626008345646192, 0.15764241401761833, 0.15637857582034606, 0.15299056702695601, 0.15429151121870616, 0.1537345074815858, 0.1530890687308844, 0.15503635677837946, 0.15586890307549972, 0.15126837853129452, 0.1518022697799149, 0.1512863971760413, 0.153703143703168, 0.15406598335503863, 0.15236557599026343, 0.15045496685725288, 0.15300695560448768, 0.150750732048514, 0.1516047752654648, 0.1495606119340082, 0.1505007631495052, 0.14960721833003046, 0.1486042550650274, 0.14973910041674565, 0.1524961694410861, 0.15087313616540507, 0.1498202047464885, 0.1497116760261467, 0.1478765352855828, 0.14944495653826423, 0.14948989161865925, 0.1507595415362839, 0.14804300199972548, 0.14827457554463025, 0.14916712001010837, 0.14792643354694396, 0.15122433540969074, 0.14946778335894975, 0.14585587293981622, 0.14533554354097133, 0.1488502203666225, 0.14958147341277664, 0.150983098144766, 0.14986528015335676, 0.14569230158577012, 0.14839119044483245, 0.1474882653499895, 0.15106345636954402, 0.14339475116248063, 0.1473850377438785, 0.14819164140819985, 0.14851089971696255, 0.14805913389113984, 0.14998288094919496, 0.14955124859194496, 0.146395573977892, 0.14768700162583234, 0.1475926937807124, 0.14800528510998806, 0.14665317894667873, 0.14722543957272863, 0.14621640972415745, 0.14533209241729347, 0.14575106036688748, 0.14758120663728044, 0.14700367669278872, 0.14439586697470366, 0.14724177906663816, 0.14622258599668544, 0.14807977540267733, 0.14604330683714542, 0.14848975014545865, 0.14498187500433535, 0.14535513611076378, 0.14784782854347658, 0.14581205689411775, 0.1467140886579208, 0.14560359691285402, 0.14593640151128462, 
0.14375969528137147, 0.14458749813740324, 0.14327121578163407, 0.14627066010452922, 0.147705182851507, 0.14444838595328674, 0.14675118904719603, 0.14563932756733627, 0.14580516375297967, 0.14607458101334306, 0.1440458365433676, 0.1470949352077517, 0.14647100315321773, 0.14253853929234614, 0.14509986841646047, 0.14468863712001404, 0.14405525501012018, 0.14459876254975665, 0.1463390997873103, 0.14611179156283538, 0.14508986493915668, 0.14570935693015358, 0.14465366227673218, 0.14523998574644956, 0.14665443648626808, 0.14252379072832247, 0.14671868628680348, 0.14624636766572763, 0.14683005319568965, 0.14471416681508484, 0.14596948506956448, 0.14825906639366357, 0.14484098490124267, 0.14762759404419873, 0.14680652565941857, 0.14556037808841882, 0.14689595254848903, 0.14729420740766827, 0.14618019989196396, 0.14382454771695807, 0.14843854594481037, 0.14622705681928175, 0.14607933288494662, 0.15087727144702498, 0.14686115308459421, 0.14832287339412023, 0.14847547491293592, 0.14711773978536655, 0.14602378538347857, 0.14899231550335032, 0.14748852161516662, 0.14515596327557384, 0.14914061380819338, 0.1530278474745529, 0.15012115809690288, 0.14834724524977155, 0.1496776070090659, 0.1500176607572323, 0.14752554079941563, 0.1469943974642996, 0.1494051582156517, 0.14864526772574743, 0.1492640491151449, 0.1531299642003143, 0.14955576330990084, 0.14747121054113946, 0.1490632162481337, 0.15050459981446843, 0.15001481892610533, 0.15041263899197996, 0.14862927494319872, 0.1522519119655683, 0.14957646426772758, 0.1480500627929035, 0.14920505481437896, 0.1480352509646078, 0.14919676920588806, 0.14878060387212674, 0.14769001449468142, 0.1515767154265375, 0.14958539846494867, 0.15253314824287628, 0.1513780225290051, 0.150855843271386, 0.15252178457079615, 0.1511002166497468, 0.15067389764247738, 0.15076939226365188, 0.15212574253714597, 0.15150820473552104, 0.15383101305799565, 0.1533949983984686, 0.15470326866988718, 0.15385000154191442, 0.15484922614490648, 0.15261970638388264, 0.1586465857254712, 0.15710132260598775, 0.1546673776329851, 0.15632863241902212, 0.15674369697773508, 0.1567934408511826, 0.15729348536153864, 0.1593824805011531, 0.15938749152239923, 0.15876757368674832, 0.1589601640612345, 0.15721040902759964, 0.15842225547800842, 0.1609542595543253, 0.1574247218801092, 0.16086126877916243, 0.15930654522854187, 0.15896836192080335, 0.16367915650982887, 0.16128568129035886, 0.1600150374516725, 0.16111729233714087, 0.16288577696170878, 0.1615253083433809, 0.16171103225403324, 0.16092858257943785, 0.16045537398724247, 0.16129047022837134, 0.16276573585008192, 0.1623985863194542, 0.16056948404672722, 0.16485847638640988, 0.16295593225891036, 0.1657523593743458, 0.16306773724805518, 0.16291081833003906, 0.16237909987313096, 0.16461737233840099, 0.16527768678472224, 0.16449990903552342, 0.16571931514354887, 0.1646403158104694, 0.16488399272645712, 0.16551221029074664, 0.16346427295195953, 0.16438685608994894, 0.16593404290815308, 0.16420726938004035, 0.16615130194627822, 0.16377955493413796, 0.16665922710398307, 0.16556714585601198, 0.16719675937583525, 0.1647509716232206, 0.16267193239261116, 0.16706680295368762, 0.1678505779685131, 0.16569294024524844, 0.16615780209358536, 0.1648591153175893, 0.16788522785375443, 0.16837332697642532]
vr_data = [13.82662095801974, 13.727338479999037, 13.872028156360182, 13.89097722360955, 13.93079046840606, 14.117091969097805, 14.007503068625477, 13.925565413664097, 13.739917094107694, 13.747188043237106, 13.978979569766786, 13.574605431471754, 13.990905513790166, 13.96002904410396, 13.70886189205939, 13.759492681029723, 13.640920576294851, 13.856586758412432, 13.657711615706287, 13.702778063322894, 13.701688541193278, 13.841395600875192, 13.768405547182018, 13.849603011152249, 13.860982669810255, 13.683970849541081, 13.703250002795402, 13.655622208563473, 13.790250649327367, 13.555006538420947, 13.522777599414653, 13.711639048541572, 13.704675998068042, 13.782192531727846, 13.44261432345077, 13.731141125496682, 13.600339973157391, 13.665104265677805, 13.548694265339531, 13.517548558501211, 13.476430732256533, 13.663594931217665, 13.771728728992521, 13.528146826540073, 13.535934920912691, 13.782852554077301, 13.613552390861283, 13.354325403575539, 13.32176870970082, 13.511079538768547, 13.58032656050174, 13.685659587824995, 13.543232115222926, 13.49094067676939, 13.640508455156088, 13.367769140048585, 13.652796291283892, 13.732512903934802, 13.470173402242676, 13.606388477172835, 13.476129919156982, 13.548608986473711, 13.443687579579803, 13.444725048418903, 13.435103470585549, 13.44724677981098, 13.441266861690057, 13.68717384312351, 13.519092479007957, 13.481228640993209, 13.629512407533564, 13.660645454710139, 13.54424231525261, 13.645917480001176, 13.602935586739083, 13.250413181435736, 13.295024227313592, 13.69394680002025, 13.538782274877168, 13.368684690417007, 13.400474388755645, 13.428642608948266, 13.402948883538555, 13.690080636676692, 13.59748140203574, 13.472073704953212, 13.462342580002206, 13.43400350260611, 13.592295559740863, 13.61918975659826, 13.454592346223794, 13.668886601144717, 13.726841038732488, 13.624610971071075, 13.424158686115378, 13.292199034862543, 13.293065867256695, 13.538275943495378, 13.356308287980871, 13.570029294410038, 13.480765597615765, 13.680774628051605, 13.75470496924557, 13.183622056082433, 13.436414987417953, 13.52267561086628, 13.617558268419447, 13.620901357069675, 13.210873677894178, 13.55039562755969, 13.586878502065698, 13.326662326818989, 13.688582890402012, 13.813414625467601, 13.730289497473393, 13.548704547610853, 13.45995493386112, 13.542033816270873, 13.572266558693807, 13.747680850429639, 13.784699726019811, 13.723070891878093, 13.568080961336008, 13.644833985172507, 13.58207838072781, 13.692541902086312, 13.491185673577942, 13.693821850264174, 13.710242640047648, 13.843356667013328, 13.862157607943255, 13.761665998833802, 14.040176636398161, 13.54754114095931, 13.80258795963324, 13.623226185769315, 13.901350308204881, 13.52641782065854, 13.82057571885481, 13.767718964341736, 13.896217173118462, 13.837016792699385, 13.820528052451293, 13.69768555800112, 13.791494535462824, 13.959346524073357, 13.708163898969973, 13.825097668288558, 13.733631249189699, 13.898643477213659, 14.061408698348341, 13.732559845279983, 13.719147546371538, 13.809566791480085, 13.936773117445592, 13.889551613318645, 13.647855716084608, 14.129341961772887, 13.947208540984011, 14.0460361339646, 13.771710574955383, 13.988306736314192, 13.982003499371917, 13.922985916806823, 13.900692352798552, 13.86371776981061, 13.827053128339903, 13.778945858164118, 13.549981338338117, 13.657828327479388, 13.746200917236154, 13.880030900802558, 13.779368582752035, 13.882506695496401, 13.771359244227146, 13.946053024510544, 13.910035481224245, 13.548784720174176, 
13.638858366527439, 13.612229746782145, 13.367445510846593, 13.82615872488761, 13.557151938384301, 13.896799104122575, 13.71202698096011, 13.772125684459011, 13.622667788924334, 13.813930602538376, 13.825239398666634, 13.94028515549426, 13.918971223275657, 13.917684046657994, 13.744620804888239, 14.020482744454362, 13.70593641834277, 13.647977024240912, 13.91775619937787, 13.871838349138885, 13.721353842646248, 13.736683581364538, 13.922571547561605, 13.726282076682708, 13.817083101551953, 13.790334237673413, 13.791001873717057, 13.844493971965036, 13.463635132069912, 13.594782607176361, 13.8708099896856, 13.6761915748309, 13.564777036030062, 13.82520628673212, 13.788619110722818, 13.829007498216896, 13.654113505196031, 13.757374310411633, 13.552763507029455, 13.842516240224095, 13.501579083841602, 13.743441496197473, 13.442141809105804, 13.659114056716843, 13.677922820214564, 13.878883788096203, 13.593862605990306, 13.728990883794788, 13.944225550496315, 13.813681238759099, 13.711748822634616, 13.871628633250824, 13.706880851831825, 13.694392651739488, 13.397011660290726, 13.530895144057379, 13.64002423095778, 13.908566587515125, 13.556872378944432, 13.620967506684933, 13.667557697206503, 13.6374878702054, 13.548920978906544, 13.7811844237869, 13.543360545520107, 13.598045747446113, 13.510328999789806, 13.654252190244579, 13.804186460079453, 13.652116058383395, 13.636745883109919, 13.398429987699618, 13.532050970215527, 13.690074876230824, 13.509700920067019, 13.641631584644085, 13.70927252322885, 13.587547197101602, 13.690655280814743, 13.505392223144952, 13.216847284940261, 13.477638791678672, 13.514946242793826, 13.392106921273149, 13.477190040811532, 13.548244151448273, 13.398762127426236, 13.185537891580479, 13.446494634603615, 13.528225125408735, 13.34007573970838, 13.282512693984536, 13.581866892791451, 13.613262774772418, 13.461079791783611, 13.464356297464795, 13.545688705883189, 13.56245021058632, 13.519183754159682, 13.414669974580937, 13.719715724047516, 13.517025101841313, 13.574248520428872, 13.561371339131874, 13.51393703064785, 13.62938106426076, 13.677185582446858, 13.854197178429322, 13.666215988709547, 13.549158204570203, 13.668444148767916, 13.597513080633206, 13.623955410010423, 13.520374203117859, 13.454880343402516, 13.644679148527297, 13.683538192055892, 13.419469174816058, 13.555083312722532, 13.490318736543802, 13.47539962586433, 13.208228050419123, 13.677316050881933, 13.502946636608531, 13.678069799968922, 13.601248266502637, 13.397101550803596, 13.656951288081249, 13.572428340697549, 13.570406696888853, 13.73585783158249, 13.413058919873079, 13.345142942514492, 13.529303565561047, 13.72990901702595, 13.76436149394222, 13.678166225079952, 13.68620854735112, 13.52798159812254, 13.494273389606809, 13.43502441175166, 13.457439241177415, 13.39077287721093, 13.774112566442481, 13.680539112347585, 13.576591563541614, 13.516834878413926, 13.44484616539562, 13.49336291776351, 13.62521855523102, 13.472825707663645, 13.49499015148682, 13.456600633817748, 13.433795807071622, 13.398154969582109, 13.570557543448416, 13.356472767532454, 13.70893441598209, 13.526270313747629, 13.664753190669622, 13.615007356973308, 13.474806447159956, 13.343404367981794, 13.602401024696682, 13.502330005890277, 13.464973522280324, 13.558539301325114, 13.386031987530835, 13.68041635852307, 13.647025754094665, 13.640358825011779, 13.725018706929289, 13.639289697508328, 13.434161683544273, 13.77875995963464, 13.432694653511218, 13.769016562879372, 13.365860657588698, 13.415697331418606, 
13.698690318331114, 13.75156734387576, 13.892225076660031, 13.607995334798927, 13.587922949544023, 13.963305977777638, 13.62877820107125, 13.934934814773955, 13.53748726334051, 13.749689797473973, 13.979351726802832, 13.97980920638437, 13.954307597970447, 13.854940891200654, 13.797257634909156, 13.770610030251996, 14.055160099312323, 13.963960841929445, 14.100226278790918, 13.92558093590301, 14.084485445678359, 14.00163903844341, 13.851389566041034, 14.166910699726843, 14.03669577106989, 13.907556384567256, 14.05991301576172, 14.162073747635823, 13.918318717415644, 14.215239658379327, 13.898269589372962, 13.826575240430788, 13.822940106226639, 13.916614723041773, 13.961717867838066, 14.093630024520413, 14.248773770469832, 14.075734304160553, 13.963961664774645, 14.399722759635223, 13.939149510770399, 14.019165994447379, 14.115574026990304, 14.024998049846902, 13.902246275030459, 13.961484366479457, 13.987931203919684, 14.268334030108418, 14.131809541613945, 14.027399715497186, 14.224633065826078, 14.008079718956525, 14.10907617662814, 14.20501393206895, 14.031713333832542, 14.284603920714972, 13.775996644948929, 14.139247865448349, 13.900382341710563, 14.007644529292238, 14.05617772845588, 14.142926524871937, 13.920836801787399, 13.987990845038315, 14.038210232095134, 14.12419829283127, 14.049372767292779, 14.013680943362475, 13.956534769552249, 14.055295352871495, 13.935458898706635, 14.217677021887967, 14.252064290012017, 13.983972815163094, 13.98273401191357, 13.736794864745473, 14.104538967475419, 14.051632336529778, 14.00738951841163, 13.900149916425367, 13.892743799727283, 14.028929690942235, 14.151598424971004, 14.269964682074185, 13.849645027432697, 14.152253220013613, 14.033175001191, 13.845754853905316, 13.886513528631346, 14.123335199280268, 14.148267284576226, 13.789640834688575, 14.101318995141373, 13.940222794494563, 13.706262234180732, 14.041942741982531, 13.91030640623009, 13.908093416573216, 13.931932623955872, 14.048834580310507, 14.031122694339764, 14.122500688555828, 14.030555407714134, 14.180123630011012, 13.789276721658089, 14.017908535319208, 13.853068659939707, 13.892907958305805, 13.851196520747983, 14.095758526906625, 14.248129847793082, 14.183709979887556, 13.963065986263123, 13.817682786735073, 13.903201133863828, 14.032705232902384, 13.833287187079954, 14.134000835578588, 14.164384334891855, 14.249981428371196, 13.732480588649882, 14.052190424711211, 14.111272050331097, 14.013748568425061, 13.846264582271887, 14.139787280459382, 13.999275238220209, 13.923221688933207, 14.09941755347629, 13.900383722019631, 13.794629147610122, 14.127635241880823, 13.7666824183554, 14.299046716807094, 14.046846040341734, 14.303845610504265, 13.99398591089842, 13.81906560450296, 13.940464475134734, 13.90608803852555, 14.110436296135996, 14.112535171110425, 14.013646894115336, 13.879773546620383]

Did you try using DataInterpolations to make the data into a function?
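For instance, a minimal sketch along these lines (using linear interpolation for illustration; the other interpolation types in DataInterpolations.jl work the same way):

using DataInterpolations

# Build callable interpolants from the sampled exogenous inputs
Tg_interp = LinearInterpolation(Tg_data, t_data)
θ_interp = LinearInterpolation(θ_data, t_data)
vr_interp = LinearInterpolation(vr_data, t_data)

Tg_interp(2.5)  # evaluate the generator torque at any t within tspan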

  1. In that case, could those interpolation functions (Tg_interp, θ_interp, vr_interp) be called inside wind_turbine(u, p, t) as shown below? It seems to give me the same error.
# Define your wind turbine model with exogenous inputs
function wind_turbine(u, p, t)
    # Extract model parameters
    c1, c2, c3, c4, c5, c6 = p
    
    # Extract current state
    wr = u
    
    # Define constants
    μd = 0.05
    Rr = 120.998
    ρ = 1.225
    Ar = π * Rr^2
    Jr = 321699000
    Jg = 3.777e6
    
    # Calculate tip-speed ratio λ
    λ = wr * Rr / vr_interp(t)
    
    # Calculate 1/λi (numerical approx. for Cp(θ, λ))
    λi_inv = 1 / (λ + 0.08 * θ_interp(t)) - 0.035 / (θ_interp(t)^3 + 1)
    
    # Calculate Cp (numerical approx. for Cp(θ, λ))
    Cp = c1 * (c2 * λi_inv - c3 * θ_interp(t) - c4) * exp(-c5 * λi_inv) + c6 * λ
    
    # Evaluate differential equation for rotor speed
    dwr = ((1 - μd) * 1 / (2 * wr) * ρ * Ar * vr_interp(t)^3 * Cp - Tg_interp(t)) / (Jr + Jg)
    
    return dwr
end

And then defining the interpolation functions as:

using DataInterpolations  # provides BSplineApprox

Tg_interp = BSplineApprox(Tg_data, t_data, 9, 10, :ArcLen, :Average)
θ_interp = BSplineApprox(θ_data, t_data, 9, 10, :ArcLen, :Average)
vr_interp = BSplineApprox(vr_data, t_data, 9, 10, :ArcLen, :Average)
  2. Apart from that, I’ve just realized that dataset in the original code also contains the exogenous inputs, and it should probably only include the state variable (wr) and time; see the one-line sketch after this list. Nonetheless, the same error keeps appearing even when I change this.
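Assuming only wr is observed, the corrected dataset would then be something like:

dataset = [Float64.(wr_data), collect(t_data)]  # observed state and time only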

I’ve realized that I had forgotten to change the output dimension of the neural network to 1 (matching my single state variable, wr) in:

chain = Lux.Chain(
            Lux.Dense(1, n, Lux.σ),
            Lux.Dense(n, n, Lux.σ),
            Lux.Dense(n, n, Lux.σ),
            Lux.Dense(n, 1)
        )

The code runs correctly once that is changed, along with the modification to dataset mentioned in my previous comment and the input interpolation suggested by Chris Rackauckas. There is probably still some issue with this implementation, as the results are not the expected ones, but that’s another story; I will open another topic to continue the discussion and add the link below.
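For completeness, a minimal sketch tying the fixes together (reusing the corrected chain and dataset above, the interpolation-based wind_turbine, and the priors from the original post):

# Rebuild the ODE problem so it captures the interpolation-based wind_turbine
prob_model = ODEProblem(wind_turbine, u0, tspan, p)

alg = BNNODE(chain;
              dataset=dataset,
              draw_samples=1000,
              l2std=[0.1],
              phystd=[0.1],
              priorsNNw=(0.0, 0.05),
              param=[Normal(0.5176, 0.5), Normal(116, 50), Normal(0.4, 0.4), Normal(5, 4), Normal(21, 10), Normal(0.0068, 0.004)],
              progress=true)

# Solve again with the Bayesian PINN
sol_lux_pestim = solve(prob_model, alg; saveat=dt)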