Mixed NN and functions fail with NeuralPDE

Hi.
I’m trying to use NeuralPDE to approximate the solution of a PDE that contains both neural networks and ordinary functions. Something goes wrong; can you help? I am new to NeuralPDE, so it might just be a newbie error.

Here is the code:

```julia
using NeuralPDE, Lux, Optimization, OptimizationOptimisers
import ModelingToolkit: Interval
@parameters x,y,z

L=5

domains = [x ∈ Interval(-L, +L),
           y ∈ Interval(-L, +L),
           z ∈ Interval(-L, +L)]

@variables G(..)


# Fixed vector field appearing in the PDE below
Fx0(x,y,z) = -sin(atan(y,x)) * exp(-(sqrt(x*x + y*y) - 2.0)^2 - z*z)
Fy0(x,y,z) =  cos(atan(y,x)) * exp(-(sqrt(x*x + y*y) - 2.0)^2 - z*z)
Fz0(x,y,z) = 0.0


Dx = Differential(x)
Dy = Differential(y)
Dz = Differential(z)

# Three coupled first-order PDEs relating the fixed field (Fx0, Fy0, Fz0)
# to the unknown function G
eqs0 = [Dy(Fz0(x,y,z)) - Dz(Fy0(x,y,z)) - Dx(G(x,y,z)) ~ 0,
        Dz(Fx0(x,y,z)) - Dy(Fz0(x,y,z)) - Dy(G(x,y,z)) ~ 0,
        Dx(Fy0(x,y,z)) - Dy(Fx0(x,y,z)) - Dz(G(x,y,z)) ~ 0]

# Periodic boundary conditions for G and its first derivatives in all three directions
bcs0 = [G(-L,y,z) ~ G(L,y,z), G(x,-L,z) ~ G(x,L,z), G(x,y,-L) ~ G(x,y,L),
        Dx(G(-L,y,z)) ~ Dx(G(L,y,z)), Dy(G(x,-L,z)) ~ Dy(G(x,L,z)), Dz(G(x,y,-L)) ~ Dz(G(x,y,L))]


input_ = length(domains)
n = 15
chain = Lux.Chain(Dense(input_, n, Lux.asinh), Dense(n, n, Lux.asinh), Dense(n, 1))

strategy = QuadratureTraining()
discretization = PhysicsInformedNN(chain, strategy)

@named pdesystem = PDESystem(eqs0, bcs0, domains, [x,y,z], [G(x,y,z)])
prob = discretize(pdesystem, discretization)
sym_prob = symbolic_discretize(pdesystem, discretization)

pde_inner_loss_functions = sym_prob.loss_functions.pde_loss_functions
bcs_inner_loss_functions = sym_prob.loss_functions.bc_loss_functions

callback = function (p, l)
    print("loss: ", l)
    print(" pde: ", map(l_ -> l_(p), pde_inner_loss_functions))
    println(" bcs: ", map(l_ -> l_(p), bcs_inner_loss_functions))
    return false
end
 
res = Optimization.solve(prob, Adam(0.5); callback = callback, maxiters = 5)
```

I get the following output:

```
loss: NaN pde: [4.9889631620980905, 4.507107273136388, NaN] bcs: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
loss: NaN pde: [NaN, NaN, NaN] bcs: [NaN, NaN, NaN, NaN, NaN, NaN]
loss: NaN pde: [NaN, NaN, NaN] bcs: [NaN, NaN, NaN, NaN, NaN, NaN]
loss: NaN pde: [NaN, NaN, NaN] bcs: [NaN, NaN, NaN, NaN, NaN, NaN]
loss: NaN pde: [NaN, NaN, NaN] bcs: [NaN, NaN, NaN, NaN, NaN, NaN]
loss: Inf pde: [4.9889631620980905, 4.507107273136388, NaN] bcs: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
```

Are you perhaps sending values to atan that are out of domain?

The two-argument atan(y, x) I use is defined for all real arguments, so that doesn’t seem to be the problem.
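For example, even the awkward points return finite values in the REPL:

```julia
julia> atan(0.0, -1.0)   # two-argument atan covers all quadrants
3.141592653589793

julia> atan(0.0, 0.0)    # even the origin returns a finite value
0.0
```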

You could use the new DebugLayer in Lux to try to see if the NaNs start appearing inside a Lux layer: Debugging Lux Models | LuxDL Docs.
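Something along these lines (a sketch based on that docs page; `nan_check=:both` asks it to check both the forward pass and the pullback):

```julia
using Lux, Random

# Wrap the existing chain so each layer checks its outputs (and pullback
# outputs) for NaNs and reports which layer produced them.
debug_chain = Lux.Experimental.@debug_mode chain nan_check=:both
ps, st = Lux.setup(Random.default_rng(), debug_chain)
y, _ = debug_chain(rand(Float32, 3, 16), ps, st)
```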

When I try to use it I get the error:

```julia
julia> debug_chain = Lux.Experimental.@debug_mode chain
ERROR: LoadError: UndefVarError: `@debug_mode` not defined
```

After updating Lux it seems to work.

After running it I get:

```
NaNs detected in pullback output for Dense(15 => 1) at location chain.layers.layer_3!
```

So the NaNs seem to first appear at the start of the backward pass.
Is there a way to follow the NaNs inside the solve call?
Or to test the loss function, or the differentiated loss function, directly? I was imagining something like the sketch below.
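Here is the kind of check I mean (a sketch; it assumes the inner loss functions returned by symbolic_discretize can be evaluated and differentiated with Zygote at the initial parameters stored in prob.u0):

```julia
using Zygote

θ0 = prob.u0  # initial network parameters from the discretized problem
for (i, loss) in enumerate(pde_inner_loss_functions)
    g = Zygote.gradient(loss, θ0)[1]          # gradient of this PDE residual loss
    println("pde loss $i: ", loss(θ0), ", NaNs in gradient: ", any(isnan, g))
end
```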

The derivatives of the functions Fx0 and Fy0 have a singularity at (x, y) = (0, 0), as the sketch below shows.
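A quick check with ForwardDiff (assuming forward-mode AD sees the same singularity as the AD used in training):

```julia
using ForwardDiff

Fx0(0.0, 0.0, 0.0)                              # 0.0: the value itself is finite
ForwardDiff.gradient(p -> Fx0(p...), zeros(3))  # contains NaN: atan(y, x) and
                                                # sqrt(x^2 + y^2) have non-finite
                                                # partial derivatives at the origin
```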
Is there a way to remove that line from the domain?
Something like

```julia
domains = [x ∈ Interval(-L, +L),
           y ∈ Interval(-L, +L),
           z ∈ Interval(-L, +L),
           (x, y) ∉ (0, 0)]   # wished-for syntax
```

but something that actually works.
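Alternatively, maybe I could regularize the functions themselves so their derivatives stay finite at the origin. A sketch (my own idea, not a NeuralPDE feature; ε is a hypothetical small constant, and it uses the identities sin(atan(y, x)) = y/r and cos(atan(y, x)) = x/r):

```julia
# Shift the squared radius by a small ε so sqrt never sees 0 and the
# derivatives of Fx0 and Fy0 stay finite everywhere in the domain.
const ε = 1e-6
r(x, y) = sqrt(x*x + y*y + ε)
Fx0(x, y, z) = -y / r(x, y) * exp(-(r(x, y) - 2.0)^2 - z*z)  # ≈ -sin(atan(y, x)) * exp(...)
Fy0(x, y, z) =  x / r(x, y) * exp(-(r(x, y) - 2.0)^2 - z*z)  # ≈  cos(atan(y, x)) * exp(...)
```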