Flux + Automatic Differentiation

Hi! I am trying to construct a PINN to solve the 1D Burgers equation in Flux.jl without using NeuralPDE.jl. The velocity field u(t,x) is defined by the neural net, and I have to calculate its derivatives with respect to t and x in order to satisfy the Burgers equation u_t + u·u_x = ν·u_xx. Here is my code:

using Flux
using Flux: gradient

net_u = Chain(Dense(2 => 20, tanh), Dense(20 => 1))

u(t,x) = net_u([t,x])[1]          # scalar network output
u_t(t,x) = gradient(u, t, x)[1]   # ∂u/∂t
u_x(t,x) = gradient(u, t, x)[2]   # ∂u/∂x
u_xx(t,x) = gradient(u_x, t, x)[2]  # ∂²u/∂x² — this is the line that errors

The first-order gradients are calculated correctly, but the second-order derivative results in an error: “Can’t differentiate foreigncall expression”. How can I calculate second-order derivatives in Flux.jl?

Is there an example of implementing a PINN network without using packages like NeuralPDE.jl?

Can you paste the full stacktrace?

Can't differentiate foreigncall expression $(Expr(:foreigncall, :(:jl_eqtable_get), Any, svec(Any, Any, Any), 0, :(:ccall), %5, %3, %4)).
You might want to check the Zygote limitations documentation.
https://fluxml.ai/Zygote.jl/latest/limitations


Stacktrace:
  [1] error(s::String)
    @ Base .\error.jl:35
  [2] Pullback
    @ .\iddict.jl:102 [inlined]
  [3] (::typeof(∂(get)))(Δ::Nothing)
    @ Zygote C:\Users\parfe\.julia\packages\Zygote\g2w9o\src\compiler\interface2.jl:0
  [4] Pullback
    @ C:\Users\parfe\.julia\packages\Zygote\g2w9o\src\lib\lib.jl:67 [inlined]
  [5] (::typeof(∂(accum_global)))(Δ::Nothing)
    @ Zygote C:\Users\parfe\.julia\packages\Zygote\g2w9o\src\compiler\interface2.jl:0
  [6] Pullback
    @ C:\Users\parfe\.julia\packages\Zygote\g2w9o\src\lib\lib.jl:78 [inlined]
  [7] (::typeof(∂(λ)))(Δ::Nothing)
    @ Zygote C:\Users\parfe\.julia\packages\Zygote\g2w9o\src\compiler\interface2.jl:0
  [8] Pullback
    @ C:\Users\parfe\.julia\packages\ZygoteRules\AIbCs\src\adjoint.jl:67 [inlined]
  [9] (::typeof(∂(λ)))(Δ::Nothing)
    @ Zygote C:\Users\parfe\.julia\packages\Zygote\g2w9o\src\compiler\interface2.jl:0
 [10] Pullback
    @ .\In[1]:6 [inlined]
 [11] (::typeof(∂(λ)))(Δ::Tuple{Nothing, Nothing, Float64})
    @ Zygote C:\Users\parfe\.julia\packages\Zygote\g2w9o\src\compiler\interface2.jl:0
 [12] Pullback
    @ C:\Users\parfe\.julia\packages\Zygote\g2w9o\src\compiler\interface.jl:45 [inlined]
 [13] (::typeof(∂(λ)))(Δ::Tuple{Nothing, Float64})
    @ Zygote C:\Users\parfe\.julia\packages\Zygote\g2w9o\src\compiler\interface2.jl:0
 [14] Pullback
    @ C:\Users\parfe\.julia\packages\Zygote\g2w9o\src\compiler\interface.jl:97 [inlined]
 [15] (::typeof(∂(gradient)))(Δ::Tuple{Nothing, Float64})
    @ Zygote C:\Users\parfe\.julia\packages\Zygote\g2w9o\src\compiler\interface2.jl:0
 [16] Pullback
    @ .\In[1]:8 [inlined]
 [17] (::Zygote.var"#60#61"{typeof(∂(u_x))})(Δ::Float64)
    @ Zygote C:\Users\parfe\.julia\packages\Zygote\g2w9o\src\compiler\interface.jl:45
 [18] gradient(::Function, ::Int64, ::Vararg{Int64})
    @ Zygote C:\Users\parfe\.julia\packages\Zygote\g2w9o\src\compiler\interface.jl:97
 [19] u_xx(t::Int64, x::Int64)
    @ Main .\In[1]:10
 [20] top-level scope
    @ In[2]:1

You should try Zygote.hessian.
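
For example, here is a minimal sketch of that approach (my own arrangement, untested against your exact setup): Zygote.hessian uses forward-over-reverse differentiation (ForwardDiff over Zygote), so it avoids the nested reverse-mode call that produces the foreigncall error. I wrap net_u as a function of a single 2-element vector [t, x]; the first derivatives come from the gradient vector, u_xx is the (2, 2) entry of the Hessian, and the viscosity value is just a placeholder.

using Flux
using Zygote

net_u = Chain(Dense(2 => 20, tanh), Dense(20 => 1))

# u as a function of a single vector v = [t, x]
u_vec(v) = net_u(v)[1]
u(t, x) = u_vec([t, x])

# First derivatives: Zygote.gradient returns a tuple whose first element
# is the length-2 gradient vector (∂u/∂t, ∂u/∂x)
u_t(t, x) = Zygote.gradient(u_vec, [t, x])[1][1]
u_x(t, x) = Zygote.gradient(u_vec, [t, x])[1][2]

# Second derivative: Zygote.hessian returns the 2x2 Hessian matrix,
# computed with ForwardDiff over Zygote, so no nested reverse mode is needed
u_xx(t, x) = Zygote.hessian(u_vec, [t, x])[2, 2]

# Burgers residual at a collocation point (placeholder viscosity ν)
ν = 0.01 / π
residual(t, x) = u_t(t, x) + u(t, x) * u_x(t, x) - ν * u_xx(t, x)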
