Hi, I have been trying to solve the ODE below:
dc/dt = -A * c * exp(-E / (R * T(t)))
Then I scaled time and temperature, and it turned into:
dc/dt_ = -A * c * tim_max * exp(-E / (R * T_(t_) * T_max))
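The tim_max factor comes from the chain rule: with t_ = t/tim_max and T_(t_) = T(t)/T_max,

dc/dt_ = (dc/dt) * tim_max = -A * c * tim_max * exp(-E / (R * T_(t_) * T_max)).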
Note that the temperature increases slowly for the first 30 minutes, then rises quickly to a peak, and finally decreases.
Note that all of the parameter values are in the code below.
I have temperature and time measurements, so I used an Akima interpolation
to define T as a function of t (T(t)). The problem I am having is that my residuals never get under 1e4 and the results are wrong. I have tried some techniques, like transforming the equation to log form to reduce the stiffness, but that failed because the true c takes small negative values at some points, where the log is undefined. I assume the problematic term is the exp term, since it produces values very close to 0 (the quick magnitude check after the constants in the code below illustrates this).
How can I solve this stiff problem?
using XLSX
using DataInterpolations
using Plots
using Lux, Random, NeuralPDE
using LuxCUDA, ComponentArrays
using CUDA
using DifferentialEquations
const gdev = gpu_device()  # Lux/MLDataDevices GPU device (CUDA backend via LuxCUDA)
#########################################
#Reading the data
xf = XLSX.readxlsx("avetempsec.xlsx")
sh = xf["Sheet1"]  # XLSX.sheetnames(xf) lists the available sheets
T = sh[:, 2]       # second column holds the temperature data (K)
#########################################
#time data generation 0 to 15000 seconds
tim = collect(0.0:15000.0)  # 15001 samples, one per second
println(length(tim))
#########################################
#constant values of ODE
A_e = 5.14e25  # pre-exponential factor A, 1/s
E_e = 2.7e5    # activation energy E, J/mol
Rc = 8.314     # gas constant R, J/(mol*K)
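# A quick magnitude check: the Arrhenius rate k(T) = A_e * exp(-E_e / (Rc * T))
# spans roughly 25 orders of magnitude between the initial and peak
# temperatures, which is where the stiffness comes from.
k(Temp) = A_e * exp(-E_e / (Rc * Temp))
k(293.15)  # ≈ 4e-23 1/s at the initial temperature
k(590.0)   # ≈ 6e1 1/s at the peak temperature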
#########################################
#convert time and Temperature data to Float64 vectors
T = convert(Vector{Float64}, vec(T))
tim = convert(Vector{Float64}, vec(tim))
##########################################
T_max = maximum(T)      # 590 K; the initial (minimum) temperature is 293.15 K
tim_max = maximum(tim)  # 15000 seconds
T_scaled = T / T_max
tim_scaled = tim / tim_max
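# Quick check that the scaling worked: both vectors now have a maximum of 1.0.
extrema(T_scaled)    # ≈ (0.497, 1.0), i.e. 293.15 K / 590 K up to 1
extrema(tim_scaled)  # (0.0, 1.0)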
##########################################
# Interpolation to define T as a function of t (T(t)).
# Note: AkimaInterpolation takes the values first, then the time points.
# Renamed A -> T_itp to avoid confusion with the pre-exponential factor A_e.
T_itp = AkimaInterpolation(T_scaled, tim_scaled)
scatter(tim_scaled, T_scaled, label = "input data")
plot!(T_itp)
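# The interpolation object is callable, so the scaled temperature can be
# evaluated at any scaled time in [0, 1], e.g.:
T_itp(0.5)  # scaled temperature halfway through the 15000 s run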
#########################################
#defining ODE
u0 = 1.0  # initial (scaled) concentration; Float64 to match the network parameters
tspan = (minimum(tim_scaled), maximum(tim_scaled))
f(u, p, t) = -A_e * u * tim_max * exp(-E_e / (Rc * T_itp(t) * T_max))
prob = ODEProblem(f, u0, tspan)
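# A minimal sanity-check sketch: integrating the same problem with a classical
# stiff solver such as Rodas5 gives a reference trajectory that the NNODE
# result should reproduce.
sol_ref = solve(prob, Rodas5(), abstol = 1e-8, reltol = 1e-8)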
#########################################
#define the Network
rng = Random.default_rng()
Random.seed!(rng, 0)
chain = Chain(Dense(1, 10, tanh), Dense(10, 10, tanh), Dense(10, 10, tanh), Dense(10, 1))
ps, st = Lux.setup(rng, chain)
ps = ps |> ComponentArray |> gdev .|> Float64  # move parameters to the GPU and promote to Float64
#########################################
using OptimizationOptimisers
opt = Adam(0.001)
alg = NNODE(chain, opt, init_params = ps)
sol = solve(prob, alg, verbose = true, maxiters = 1000, saveat = 0.01)
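# A minimal comparison sketch, assuming sol_ref from the Rodas5 check above:
plot(sol_ref, label = "Rodas5 reference")
plot!(sol, label = "NNODE")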
Thanks for your time. @ChrisRackauckas