Fick's Second Law -- NeuralPDE

Hello everyone,

I’m trying to solve Fick’s second law in 1D, which is analogous to the thermal diffusion (heat) equation. Fick’s second law describes how a solute diffuses through a solvent.
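For reference, written out explicitly (this matches the eq, bcs and domains defined in the code below), the problem I want to solve is

∂u/∂t = κ ∂²u/∂x²,   0 ≤ x ≤ 1,  0 ≤ t ≤ 1,

with homogeneous Dirichlet boundary conditions u(t, 0) = u(t, 1) = 0 and the triangular initial profile u(0, x) = 2x for 0 ≤ x ≤ 1/2 and u(0, x) = 2(1 − x) for 1/2 ≤ x ≤ 1.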

I was inspired by this code (1D Wave Equation with Dirichlet boundary conditions), so I tried this:

using NeuralPDE, Lux  
using Optimization, OptimizationOptimJL, Plots
import ModelingToolkit: Interval

@parameters t, x
@variables u(..)
Dxx = Differential(x)^2
Dt = Differential(t)

#1D-PDE
κ = 1                                  # Fick diffusion coefficient
eq = Dt(u(t,x)) ~ κ*Dxx(u(t,x))        # Fick second law

# Initial and boundary conditions
bcs = [u(t,0) ~ 0.,         # for all t > 0 -- BC
       u(t,1) ~ 0.,         # for all t > 0 -- BC
       u(0,x) ~ 2x,         # for all 0 ≤ x ≤ 1/2  and for t > 0 -- Init.Cond
       u(0,x) ~ 2*(1. - x)] # for all 1/2 ≤ x ≤ 1  and for t > 0 -- Init.Cond
     

# Space and time domains
domains = [t ∈ Interval(0.0,1.0),
           x ∈ Interval(0.0,1.0)]

# Discretization
dx = 0.01

# Neural network
inner = 20
chain = Lux.Chain(Dense(2,inner,Lux.σ),Dense(inner,inner,Lux.σ),Dense(inner,1))

strategy = GridTraining(dx)
discretization = PhysicsInformedNN(chain, strategy)

@named pde_system = PDESystem(eq,bcs,domains,[t,x],[u(t,x)])
prob = discretize(pde_system,discretization)

callback = function (p,l)
    println("Loss: $l")
    return false
end

# optimizer
opt = OptimizationOptimJL.BFGS()
res = Optimization.solve(prob,opt; callback = callback, maxiters=3000)
phi = discretization.phi

# Graphs
ts,xs = [infimum(d.domain):dx:supremum(d.domain) for d in domains]
analytic_sol_f(t,x) =  sum([exp(-t*(n*π)^(2)) * sin(n*π*x) for n in 1:2:50000])

u_predict = reshape([first(phi([t,x],res.u)) for t in ts for x in xs],(length(ts),length(xs)))
u_analyt = reshape([analytic_sol_f(t,x) for t in ts for x in xs], (length(ts),length(xs)))

diff_u = abs.(u_predict .- u_analyt)
p1 = plot(ts, xs, u_analyt, linetype=:contourf, yaxis = "x (u.a.)", title = "Analytic");
p2 = plot(ts, xs, u_predict, linetype=:contourf, xaxis = "t (u.a.)", yaxis = "x (u.a.)", title = "Prediction");
p3 = plot(ts, xs, diff_u, linetype=:contourf, title = "Error");
plot(p1,p2,p3)

The analytical solution is:

u(t, x) = Σₙ Cₙ exp(−κ (nπ)² t) sin(nπx),   n = 1, 3, 5, …

where Cₙ and κ can be normalized to 1, L = xₘₐₓ = 1, 0.0 ≤ t ≤ 1.0, 0.0 ≤ x ≤ 1.0, and the sum is truncated at n in 1:2:50000.

I have three questions:

  • Q1: Do you think the way the initial condition is expressed (i.e. split over two equations) is valid?
  • Q2: Does the comparison between the “Analytic” graph and the “Prediction” graph seem reasonable to you, considering the “Error” graph? (I also add a small numeric error check after the graphs below.)
  • Q3: As I have an Nvidia 3070 Ti graphics card, I tried to use CUDA. But before doing so, I tested the following code and obtained this error:
ERROR: UndefVarError: `ComponentArray` not defined

Can you help me resolve this error? When I adapt my code to use the GPU, I get the same error as reported above.

Thank you in advance for any help.
T.

Here are the graphs obtained with the code above:
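To make the comparison in Q2 more quantitative, here is a small addition I could append to the end of the script (just a sketch of mine; it only reuses the u_predict and u_analyt arrays defined above):

using LinearAlgebra   # for norm

# Scalar error measures to accompany the contour plots
max_abs_err = maximum(abs.(u_predict .- u_analyt))          # worst-case pointwise error
rel_l2_err  = norm(u_predict .- u_analyt) / norm(u_analyt)  # relative L2 error over the grid
println("max abs error = $max_abs_err, relative L2 error = $rel_l2_err")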

Use https://github.com/SciML/NeuralPDE.jl/blob/master/docs/src/tutorials/gpu.md. I think the latest version of the code hasn’t been deployed to the documentation website.
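The key steps in that tutorial (sketched from memory here, so please double-check against the linked file) are to flatten the network parameters into a ComponentArray, move them to the GPU, and hand them to PhysicsInformedNN via init_params. Note that ComponentArray lives in the ComponentArrays.jl package, which also has to be loaded:

using NeuralPDE, Lux, Random, ComponentArrays, CUDA

# Build the network, then initialise its parameters and move them to the GPU
chain = Lux.Chain(Dense(2, 20, Lux.σ), Dense(20, 20, Lux.σ), Dense(20, 1))
ps = Lux.setup(Random.default_rng(), chain)[1]     # named tuple of parameters
ps = ps |> ComponentArray |> gpu .|> Float64       # flat GPU array, element-wise Float64

discretization = PhysicsInformedNN(chain, GridTraining(0.01), init_params = ps)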

u₀(x) = ifelse(x < 0.5, 2x, 2(1-x))
@register_symbolic u₀(x)

bcs = [u(t,0) ~ 0.,       
       u(t,1) ~ 0.,
       u(0,x) ~u₀(x) ]
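A quick sanity check (just a throwaway sketch) that this reproduces both branches of the original two-line initial condition:

u₀(x) = ifelse(x < 0.5, 2x, 2(1 - x))   # same definition as above
@assert u₀(0.25) ≈ 0.5    # left branch: 2x
@assert u₀(0.75) ≈ 0.5    # right branch: 2(1 - x)
@assert u₀(0.5) ≈ 1.0     # peak of the triangular profile at x = 1/2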

Thank you very much, Yicheng Wu. :+1: :wink:
The way you have expressed the initial condition is perfect and elegant, and the optimization now behaves much better (the loss drops below about 2.04e-4).
So, this resolves Q1 and Q2.

Thank you very much, avikpal. :+1: :wink:
The code you mentioned works perfectly as well, and so does the modification I incorporated into my own code (Fick's second law).

Not an ad, but you could try Sophon.jl; it should be faster and more accurate. NeuralPDE is being rewritten because its old implementation has some shortcomings, and Sophon is essentially that rewrite.

I’m already looking at the GitHub pages.
Thanks for bringing it to my attention. Everything is moving so fast in the Julia ecosystem now.

After successfully trying the code recommended by Avik Pal:
NeuralPDE.jl/gpu.md at master · SciML/NeuralPDE.jl · GitHub
I modified my program as follows:

using NeuralPDE, Lux, CUDA, Random, ComponentArrays
using Optimization, OptimizationOptimisers
using Plots
import ModelingToolkit: Interval

@parameters t, x
@variables u(..)
Dxx = Differential(x)^2
Dt = Differential(t)

#1D-PDE
κ = 0.75                               # Fick diffusion coefficient
eq = Dt(u(t,x)) ~ κ*Dxx(u(t,x))        # Second Fick law

# Initial and boundary conditions
u₀(x) = ifelse(x < 0.5, 2x, 2(1-x))
@register_symbolic u₀(x)
bcs = [u(t,0) ~ 0.,                    # BC for all t > 0
       u(t,1) ~ 0.,                    # BC for all t > 0
       u(0,x) ~ u₀(x)]                 # Init. Cond.: 2x for 0 ≤ x ≤ 1/2, 2(1 - x) for 1/2 ≤ x ≤ 1
     
# Space and time domains
domains = [t ∈ Interval(0.0,1.0),
           x ∈ Interval(0.0,1.0)]

# Discretization
dx = 0.01

# Neural network
inner = 20
chain = Lux.Chain(Dense(2,inner,Lux.σ),
                  Dense(inner,inner,Lux.σ),
                  Dense(inner,1))

strategy = GridTraining(dx)
ps = Lux.setup(Random.default_rng(), chain)[1]
ps = ps |> ComponentArray |> gpu .|> Float64
discretization = PhysicsInformedNN(chain, strategy, init_params = ps)

@named pde_system = PDESystem(eq, bcs, domains, [t,x], [u(t,x)])
prob = discretize(pde_system, discretization)
symprob = symbolic_discretize(pde_system, discretization)

callback = function (p,l)
    println("Loss: $l")
    return false
end

# optimizer
#opt = OptimizationOptimJL.BFGS()
res = Optimization.solve(prob, Adam(0.001); callback = callback, maxiters=3000)
prob = remake(prob, u0 = res.u)
res = Optimization.solve(prob, Adam(0.001); callback = callback, maxiters = 3000)

phi = discretization.phi

# Graphs
ts,xs = [infimum(d.domain):dx:supremum(d.domain) for d in domains]
analytic_sol_f(t,x) = sum([exp(-t*κ*(n*π)^(2)) * sin(n*π*x) for n in 1:2:50000])

u_predict = reshape([first(phi([t,x],res.u)) for t in ts for x in xs],(length(ts),length(xs)))
u_analyt = reshape([analytic_sol_f(t,x) for t in ts for x in xs], (length(ts),length(xs)))

diff_u = abs.(u_predict .- u_analyt)
p1 = plot(ts, xs, u_analyt, linetype=:contourf, xaxis = "t (u.a.)", title = "Analytic");
p2 = plot(ts, xs, u_predict, linetype=:contourf, xaxis = "t (u.a.)", yaxis="x (u.a)", title = "Prediction");
p3 = plot(ts, xs, diff_u, linetype=:contourf, title = "Error");
plot(p1,p2,p3)

However, I get this error:

ERROR: CuArray only supports element types that are allocated inline.
Real is not allocated inline

Unfortunately, I cannot figure out what I am doing wrong.
Thank you in advance for any help.

The step where the ERROR is thrown is:

prob = discretize(pde_system, discretization)

Up to and including the following line,

@named pde_system = PDESystem(eq, bcs, domains, [t,x], [u(t,x)])

the REPL shows:

PDESystem
Equations: Equation[Differential(t)(u(t, x)) ~ 0.75Differential(x)(Differential(x)(u(t, x)))]
Boundary Conditions: Equation[u(t, 0.0) ~ 0.0, u(t, 1.0) ~ 1.2246467991473532e-16exp(-7.4022033008170185t), u(0.0, 0) ~ sin(πx)]
Domain: Symbolics.VarDomainPairing[Symbolics.VarDomainPairing(t, 0.0..1.0), Symbolics.VarDomainPairing(x, 0.0..1.0)]
Dependent Variables: Num[u(t, x)]
Independent Variables: Num[t, x]
Parameters: SciMLBase.NullParameters()
Default Parameter ValuesDict{Any, Any}()
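One thing that might be worth checking (just an idea of mine, a quick sketch, not something I have ruled out yet) is whether the parameter vector really ended up as a concretely typed GPU array, since the error complains about the abstract element type Real:

typeof(ps)   # expected: a ComponentArray wrapping a CuArray
eltype(ps)   # expected: Float64, not an abstract type such as Real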

Any ideas?

I checked the driver of my Nvidia card and selected version 530 instead of 525. This resolved the problem.
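In case it is useful to someone else, the driver/runtime versions that CUDA.jl actually sees can be printed directly from Julia (a quick sketch; CUDA.versioninfo() and CUDA.device() are standard CUDA.jl calls):

using CUDA

# Print the CUDA toolkit, driver and GPU device information detected by CUDA.jl
CUDA.versioninfo()

# Show which GPU device is currently selected
CUDA.device()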