# Issue adapting NeuralPDE.jl documentation example code to use case

Hello everyone, I am attempting to adapt the code from [an example in the NeuralPDE documentation], but am running into trouble.

I am still fairly new to Julia, so I apologize in advance if the solution is trivial.

The code I am running is:

``````julia
using NeuralPDE, Lux, Optimization, OptimizationOptimJL
import ModelingToolkit: Interval

@parameters x y t
@variables V(..)
Dx = Differential(x)
Dy = Differential(y)
Dt = Differential(t)

u0 = [3., 3.]
x0 = 3.
y0 = 3.

t0=0.
tf=30.

# Dynamical systems model parameters
#alpha, beta, gamma, delta, epsilon, K1, K2
#p = [1., .5, .5, 1.0, 1.0, 10., 10.]
alpha = 1.
beta = .5
gamma = .5
delta = 1.
epsilon = 1.
K1 = 5.
K2 = 5.

# Loss function parameters
c = 1.
a = 1.
b = 1.

# Depreciation rate.
r = 0.06

# 2D PDE
#eq = Dxx(u(x, y)) + Dyy(u(x, y)) ~ -sin(pi * x) * sin(pi * y)
eq = -Dt(V(t,x,y)) + (1/(4*c))*exp(-r*t)*Dx(V(t,x,y))^2 - a*x + b*y
- Dx(V(t,x,y))*(alpha*x*(1-x/K1)-beta*x*y)
- Dy(V(t,x,y))*(gamma*x*y - delta*y^2/K2 - epsilon*y) ~ 0

bcs = [V(t0,x,y) ~ 0.0,
       V(tf,x,y) ~ 0.0,
       Dt(V(t,10.0,y)) ~ 0,
       Dt(V(t,0.0,y)) ~ 0,
       Dt(V(t,x,10.0)) ~ 0,
       Dt(V(t,x,0.0)) ~ 0]

domains = [t ∈ Interval(t0, tf),
           x ∈ Interval(0.0, 10.0),
           y ∈ Interval(0.0, 10.0)]

# Neural network
dim = 3 # number of dimensions
chain = Lux.Chain(Dense(dim, 16, Lux.σ), Dense(16, 16, Lux.σ), Dense(16, 1))

# Discretization
dx = 0.05
discretization = PhysicsInformedNN(chain, GridTraining(dx))

@named pde_system = PDESystem(eq, bcs, domains, [t, x, y], [V(t,x,y)])
prob = discretize(pde_system, discretization)

#Optimizer
opt = OptimizationOptimJL.BFGS()

#Callback function
callback = function (p, l)
    println("Current loss is: $l")
    return false
end

res = Optimization.solve(prob, opt, callback = callback, maxiters = 1000)
``````

The stacktrace I get is:

``````
ERROR: LoadError: type Num has no field lhs
Stacktrace:
[1] getproperty
@ ./Base.jl:37 [inlined]
[2] parse_equation(pinnrep::NeuralPDE.PINNRepresentation, eq::Num)
@ NeuralPDE ~/.julia/packages/NeuralPDE/OPjbo/src/symbolic_utilities.jl:307
[3] build_symbolic_loss_function(pinnrep::NeuralPDE.PINNRepresentation, eqs::Num; eq_params::SciMLBase.NullParameters, param_estim::Bool, default_p::Nothing, bc_indvars::Vector{Any}, integrand::Nothing, dict_transformation_vars::Nothing, transformation_vars::Nothing, integrating_depvars::Vector{Symbol})
@ NeuralPDE ~/.julia/packages/NeuralPDE/OPjbo/src/discretize.jl:60
[4] (::NeuralPDE.var"#300#314"{NeuralPDE.PINNRepresentation})(::Tuple{Num, Vector{Any}, Vector{Symbol}})
@ NeuralPDE ./none:0
[5] iterate
@ ./generator.jl:47 [inlined]
[6] collect(itr::Base.Generator{Base.Iterators.Zip{Tuple{Vector{Num}, Vector{Vector{Any}}, Vector{Vector{Symbol}}}}, NeuralPDE.var"#300#314"{NeuralPDE.PINNRepresentation}})
@ Base ./array.jl:782
[7] symbolic_discretize(pde_system::PDESystem, discretization::PhysicsInformedNN{GridTraining{Float64}, Nothing, NeuralPDE.Phi{Chain{NamedTuple{(:layer_1, :layer_2, :layer_3), Tuple{Dense{true, typeof(sigmoid_fast), typeof(glorot_uniform), typeof(zeros32)}, Dense{true, typeof(sigmoid_fast), typeof(glorot_uniform), typeof(zeros32)}, Dense{true, typeof(identity), typeof(glorot_uniform), typeof(zeros32)}}}, Nothing}, NamedTuple{(:layer_1, :layer_2, :layer_3), Tuple{NamedTuple{(), Tuple{}}, NamedTuple{(), Tuple{}}, NamedTuple{(), Tuple{}}}}}, typeof(NeuralPDE.numeric_derivative), Bool, Nothing, Nothing, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}})
@ NeuralPDE ~/.julia/packages/NeuralPDE/OPjbo/src/discretize.jl:542
[8] discretize(pde_system::PDESystem, discretization::PhysicsInformedNN{GridTraining{Float64}, Nothing, NeuralPDE.Phi{Chain{NamedTuple{(:layer_1, :layer_2, :layer_3), Tuple{Dense{true, typeof(sigmoid_fast), typeof(glorot_uniform), typeof(zeros32)}, Dense{true, typeof(sigmoid_fast), typeof(glorot_uniform), typeof(zeros32)}, Dense{true, typeof(identity), typeof(glorot_uniform), typeof(zeros32)}}}, Nothing}, NamedTuple{(:layer_1, :layer_2, :layer_3), Tuple{NamedTuple{(), Tuple{}}, NamedTuple{(), Tuple{}}, NamedTuple{(), Tuple{}}}}}, typeof(NeuralPDE.numeric_derivative), Bool, Nothing, Nothing, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}})
@ NeuralPDE ~/.julia/packages/NeuralPDE/OPjbo/src/discretize.jl:683
[9] top-level scope
@ ~/Development/OptimalControlProject/NeuralPDE_Test_postable.jl:63
[10] include(fname::String)
@ Base.MainInclude ./client.jl:478
[11] top-level scope
@ REPL[3]:1
in expression starting at ... NeuralPDE_Test_postable.jl:63
``````

Your `eq` isn't going to be what you think it is. The first line of the expression is already complete, so Julia stops parsing there; the two continuation lines and the `~ 0` are evaluated as separate statements, and `eq` ends up a `Num` rather than an `Equation` — which is exactly what `type Num has no field lhs` is complaining about. End each line with the operator so the parser keeps reading:

``````julia
eq = -Dt(V(t,x,y)) + (1/(4*c))*exp(-r*t)*Dx(V(t,x,y))^2 - a*x + b*y -
Dx(V(t,x,y))*(alpha*x*(1-x/K1)-beta*x*y) -
Dy(V(t,x,y))*(gamma*x*y - delta*y^2/K2 - epsilon*y) ~ 0
``````
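The same parsing behavior is easy to see with plain numbers — a minimal sketch, nothing NeuralPDE-specific:

``````julia
# A line that is already a complete expression ends there:
a = 1 + 2
- 3        # parsed as a separate, standalone expression, so a == 3

# A trailing operator tells the parser the expression continues:
b = 1 + 2 -
    3      # b == 0
``````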

I took a closer look at the documentation and solved my problem.

I changed the first chunk of my code to:

``````julia
@parameters x y t
@variables V(..)
Dx = Differential(x)
Dy = Differential(y)
Dt = Differential(t)

Dx2 = Differential(x)^2

eq = -Dt(V(t,x,y)) ~ -(1/(4*c))*exp(-r*t)*Dx2(V(t,x,y)) + a*x - b*y +
     Dx(V(t,x,y))*(alpha*x*(1-x/K1) - beta*x*y) +
     Dy(V(t,x,y))*(gamma*x*y - delta*y^2/K2 - epsilon*y)
``````
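A quick way to confirm the fix is to check that `eq` is now an `Equation` (which `~` builds) rather than a bare `Num`. A minimal sketch using a hypothetical one-variable version of the problem, relying only on the symbolic types ModelingToolkit re-exports:

``````julia
using ModelingToolkit

@parameters x
@variables V(..)
Dx = Differential(x)

broken = Dx(V(x)) + x        # no ~ on this line: just a Num
fixed  = Dx(V(x)) + x ~ 0    # ~ builds an Equation with lhs/rhs fields

broken isa Equation  # false — this is what triggered "type Num has no field lhs"
fixed isa Equation   # true
``````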

Thank you for the help.