# PDE with neural network and derivative of NN using MethodOfLines

I want to solve the energy equation where the potential energy is expressed by a neural network, and then solve it for a prescribed energy (actually temperature) profile. A very simplified version of my problem would look like the code below; it would then become more complicated with fluid properties, variable velocity, etc.

```julia
using ModelingToolkit
using MethodOfLines, DomainSets
using DifferentialEquations
using Symbolics
#using Lux

##
@variables x, t
@parameters v
@variables e(..)

Dx = Differential(x)
Dt = Differential(t)

##
params = [
    v => 1.5
]

##
t_min, t_max = (0, 120.0)
x_min, x_max = (0, 1.0)
domains = [
    t ∈ Interval(t_min, t_max),
    x ∈ Interval(x_min, x_max)
]

##
q_dot(x) = 100 * x
NN(x) = "Some neural network"  # placeholder: stands in for the potential-energy network

##
eqs = [
    Dt(e(x, t) + NN(x)) + Dx(v*(e(x, t) + NN(x))) ~ q_dot(x) + v * Dx(NN(x))
]

##
bcs = [
    e(x, t_min) ~ 1e4,
    e(x_min, t) ~ 1e4
]

##
@named pdesys = PDESystem(eqs, bcs, domains, [x, t], [e(x, t)], params)
discretization = MOLFiniteDifference([x => 50], t, advection_scheme=UpwindScheme())
prob = discretize(pdesys, discretization)

##
sol = solve(prob, Rodas4())
```

Is training a neural network within this MethodOfLines/ODE-solve framework still the best approach to take?
I picked up somewhere that Lux is preferred over Flux within ModelingToolkit?
Can ModelingToolkit and MethodOfLines take care of the term `Dx(NN(x))`, or do I have to handle it manually by building the derivatives myself and then using `@register_symbolic` and `Symbolics.derivative`?

Where would I then find a good example of doing such a thing?
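For context, the general registration pattern (independent of any neural network) can be sketched like this; `f` here is just a stand-in scalar function, not anything from my model:

```julia
using Symbolics
using ForwardDiff

# Stand-in for the network: any scalar function of one argument.
f(x) = x^2 + sin(x)
df(x) = ForwardDiff.derivative(f, x)

# Register both as opaque symbolic primitives, then supply the analytic
# derivative rule for `f` with respect to its first (and only) argument.
@register_symbolic f(x)
@register_symbolic df(x)
Symbolics.derivative(::typeof(f), args::NTuple{1,Any}, ::Val{1}) = df(args[1])

@variables x
Dx = Differential(x)
expr = expand_derivatives(Dx(f(x)))  # rewrites symbolically to df(x)
```

With this in place, any `Dx(f(x))` appearing in the PDE is rewritten to `df(x)` during symbolic simplification instead of erroring.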

I already looked at Friction Model · ModelingToolkitNeuralNets.jl, but with the equation system looking like this:

```julia
eqs = [
    Dt(e(x, t) + nn_in.u[1]) + Dx(v*(e(x, t) + nn_in.u[1])) ~ q_dot(x) + v * Dx(nn_in.u[1]),
    x ~ nn_out.u[1]
]
```

my system becomes overdetermined. I assume this is because the neural net's input is the independent variable rather than a dependent one as in the example.

Or should I look in the direction of NeuralPDE.jl instead of MethodOfLines overall?
Any input would be appreciated, and thanks in advance.

In the meantime I was able to solve the issues with the neural net and its derivative. Here is a working example with just the constant initial parameters:

```julia
using ModelingToolkit
using MethodOfLines, DomainSets
using DifferentialEquations
using Symbolics
using ForwardDiff
using Lux
using LuxCore: stateless_apply
using Random: Xoshiro
#using ComponentArrays: ComponentArray
#using SciMLStructures
#using SciMLStructures: Tunable
##
@variables x, t
@parameters v ρ cp θ
@variables T(..)

Dx = Differential(x)
Dt = Differential(t)

##
chain = Lux.Chain(
    Lux.Dense(1 => 10, Lux.mish, use_bias = false),
    Lux.Dense(10 => 10, Lux.mish, use_bias = false),
    Lux.Dense(10 => 1, use_bias = false)
)

##
params = [
    v => 1.5,
    ρ => 2.6,
    cp => 1200e3
    #θ => Lux.initialparameters(Xoshiro(0), chain)
]

##
t_min, t_max = (0, 120.0)
x_min, x_max = (0, 1.0)
domains = [
    t ∈ Interval(t_min, t_max),
    x ∈ Interval(x_min, x_max)
]

##
q_dot(x) = 100 * x
#NN(x) = x^2
#dNNdx(x) = ForwardDiff.derivative(NN, x)

nn(x,θ) = first(stateless_apply(chain, [x], θ))
dnn(x,θ) = first(ForwardDiff.derivative(x -> nn(x,θ), x))
##
#initθ = Lux.initialparameters(Xoshiro(0),chain)
#nn(2.0,initθ)
#dnn(2.0,initθ)
#@register_symbolic nn(x,θ)
#@register_symbolic dnn(x,θ)
#Symbolics.derivative(::typeof(nn), args::NTuple{2,Any}, ::Val{1}) = dnn(args[1],args[2])

initθ = Lux.initialparameters(Xoshiro(0),chain)
nn(x) = first(stateless_apply(chain, [x], initθ))
dnn(x) = first(ForwardDiff.derivative(nn, x))
## Test the NN
nn(2.0)
dnn(2.0)
##
@register_symbolic nn(x)
@register_symbolic dnn(x)
Symbolics.derivative(::typeof(nn), args::NTuple{1,Any}, ::Val{1}) = dnn(args[1])

##
eqs = [
    #Dt(ρ*cp*T(x, t) + nn(x,θ)) + Dx(v*(ρ*cp*T(x, t) + nn(x,θ))) ~ q_dot(x) + v * Dx(nn(x,θ))
    Dt(ρ*cp*T(x, t) + nn(x)) + Dx(v*(ρ*cp*T(x, t) + nn(x))) ~ q_dot(x) + v * Dx(nn(x))
]

##
bcs = [
    T(x, t_min) ~ 393.15,
    T(x_min, t) ~ 393.15
]

##
@named pdesys = PDESystem(eqs, bcs, domains, [x, t], [T(x, t)], params)
discretization = MOLFiniteDifference([x => 50], t, advection_scheme=UpwindScheme())
prob = discretize(pdesys, discretization)

##
sol = solve(prob, Rodas4())
```

But I still struggle to make the neural net parameters part of the ModelingToolkit parameters and, as a result, to set up the optimization problem, since the workflow in Friction Model · ModelingToolkitNeuralNets.jl seems rather sophisticated and I fail to fully understand it.
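In case it helps, one direction I am experimenting with (an untested sketch; I have not verified that MOLFiniteDifference handles array-valued parameters this way) is to flatten the Lux parameter NamedTuple into a single vector with ComponentArrays and register the network with that vector as a second argument:

```julia
using Lux, Symbolics, ForwardDiff
using LuxCore: stateless_apply
using ComponentArrays: ComponentArray, getaxes
using Random: Xoshiro

chain = Lux.Chain(
    Lux.Dense(1 => 10, Lux.mish, use_bias = false),
    Lux.Dense(10 => 1, use_bias = false)
)

# Flatten the nested parameter NamedTuple into one vector; keep the axes
# so the nested structure can be rebuilt inside the function.
θ0 = ComponentArray(Lux.initialparameters(Xoshiro(0), chain))
ax = getaxes(θ0)

nn(x, θ) = first(stateless_apply(chain, [x], ComponentArray(θ, ax)))
dnn(x, θ) = ForwardDiff.derivative(x -> nn(x, θ), x)

# Register with a vector-valued second argument (assumption: this is the
# part I have not confirmed works through the MOL discretization).
@register_symbolic nn(x::Real, θ::AbstractVector)
@register_symbolic dnn(x::Real, θ::AbstractVector)
Symbolics.derivative(::typeof(nn), args::NTuple{2,Any}, ::Val{1}) = dnn(args[1], args[2])

# The weights would then enter the PDESystem as one tunable parameter vector:
@parameters θ[1:length(θ0)] = Vector(θ0)
```

The hope is that the flattened `θ` then shows up as an ordinary tunable parameter vector in the discretized problem, which an optimizer could adjust.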

We haven’t connected ModelingToolkitNeuralNets.jl and MethodOfLines.jl or NeuralPDE.jl yet. We’re getting close but not there.