Hi everyone,
I would like to discuss best practices for the automatic generation of Jacobians when solving PDEs. I understand that the ‘way to go’ is to (1) define a residual function (loop-style, pure Julia), (2) use SparsityDetection to get the sparsity pattern (assuming we do not know it a priori), (3) generate a coloring of that pattern and (4) finally use ForwardDiff (via SparseDiffTools’ forwarddiff_color_jacobian!) to evaluate the compressed Jacobian.
Let’s say I want to solve a 1D Poisson equation with a variable coefficient, so I would start by defining a residual function. The AD routines expect this function to take only the residual vector and the solution vector, so the number of input parameters is limited. I managed to get it working by defining the remaining parameters (the coefficient and the grid spacing) as global variables, which is not a very satisfying approach (see MWE1 just below).
using ForwardDiff, SparseDiffTools, SparseArrays, LinearAlgebra, SparsityDetection
# These parameters should ideally not be defined globally
nx = 11
k = ones(Float64,nx+1)
dx = 1.0
T = ones(Float64,nx)
##################################################
function Func!( f, T ) # 2 arguments needed, others can be made optional
    for i in 2:length(T)-1 # a loop seems essential here, the vectorised version does not appear to work
        qW   = -k[i]   * (T[i]   - T[i-1]) / dx
        qE   = -k[i+1] * (T[i+1] - T[i]  ) / dx
        f[i] = (qE - qW) / dx
    end
    f[1]   = -2T[1] + T[2]
    f[end] = T[end-1] - 2T[end]
    nothing
end
##################################################
function main()
    # Sparsity pattern
    input    = rand(nx)
    output   = similar(input)
    sparsity = jacobian_sparsity(Func!, output, input)
    jac      = Float64.(sparse(sparsity))
    # Make the coloring
    colors   = matrix_colors(jac)
    # Evaluate Jacobian
    forwarddiff_color_jacobian!(jac, Func!, T, colorvec = colors)
    # Compute residual
    f = zero(T)
    Func!( f, T )
    println("Before solve: norm f = ", norm(f))
    # Newton step
    dT  = -(jac \ f)
    T .+= dT
    # Compute residual again
    Func!( f, T )
    println("After solve: norm f = ", norm(f))
end
##################################################
main()
Another solution is to define all input parameters in the main routine (what I would like to do) and pass them to the residual evaluation routine. For this to work with the AD calls, these parameters need to be given default dummy values, which feels a bit awkward (they are irrelevant for the differentiation itself as long as they do not depend on the solution, but they are essential for direct function evaluations). This is what is done in MWE2 below.
using ForwardDiff, SparseDiffTools, SparseArrays, LinearAlgebra, SparsityDetection
##################################################
function Func!( f, T, k=ones(Float64, length(T)+1), dx=1.0 ) # 2 arguments needed, others can be made optional
    for i in 2:length(T)-1 # a loop seems essential here, the vectorised version does not appear to work
        qW   = -k[i]   * (T[i]   - T[i-1]) / dx
        qE   = -k[i+1] * (T[i+1] - T[i]  ) / dx
        f[i] = (qE - qW) / dx
    end
    f[1]   = -2T[1] + T[2]
    f[end] = T[end-1] - 2T[end]
    nothing
end
##################################################
function main()
    # Configuration: parameters now live in main() instead of in global scope
    nx = 11
    k  = ones(Float64,nx+1)
    dx = 1.0
    T  = ones(Float64,nx)
    # Sparsity pattern
    input    = rand(nx)
    output   = similar(input)
    sparsity = jacobian_sparsity(Func!, output, input)
    jac      = Float64.(sparse(sparsity))
    # Make the coloring
    colors   = matrix_colors(jac)
    # Evaluate Jacobian (NB: Func! is called with only 2 arguments here, so it falls back on the dummy defaults for k and dx)
    forwarddiff_color_jacobian!(jac, Func!, T, colorvec = colors)
    # Compute residual
    f = zero(T)
    Func!( f, T, k, dx )
    println("Before solve: norm f = ", norm(f))
    # Newton step
    dT  = -(jac \ f)
    T .+= dT
    # Compute residual again
    Func!( f, T, k, dx )
    println("After solve: norm f = ", norm(f))
end
##################################################
main()
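A third variant I have been sketching (not thoroughly tested, and main_closure / Res! are just names I made up) keeps the parameters local to the main routine and captures them in a closure, so that the AD routines only ever see a two-argument callable while the residual function itself keeps explicit k and dx arguments. It reuses the four-argument Func! from MWE2:
function main_closure()
    # Parameters stay local: no globals and no dummy defaults needed
    nx = 11
    k  = ones(Float64,nx+1)
    dx = 1.0
    T  = ones(Float64,nx)
    # Closure capturing k and dx; this is the only callable the AD machinery sees
    Res! = (f, x) -> Func!(f, x, k, dx)
    # Sparsity pattern and coloring, exactly as before
    input    = rand(nx)
    output   = similar(input)
    jac      = Float64.(sparse(jacobian_sparsity(Res!, output, input)))
    colors   = matrix_colors(jac)
    # Jacobian, residual and one Newton step
    forwarddiff_color_jacobian!(jac, Res!, T, colorvec = colors)
    f = zero(T)
    Res!( f, T )
    T .-= jac \ f
    Res!( f, T )
    println("After solve: norm f = ", norm(f))
end
I am not sure whether this closure approach is the intended pattern, whether the closure has to be rebuilt whenever k or dx change, or whether capturing locals like this has performance implications.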
So the question is: are there better ways to define the residual functions and evaluate the Jacobian? I would like to know the best practices before moving on to multi-dimensional problems including non-linearities…
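For context, here is roughly the kind of nonlinear extension I have in mind (a sketch I have not run; NLFunc!, newton_sketch and the k(T) = 1 + T^2 law are made up for illustration), where the Jacobian is re-evaluated with the same coloring machinery inside a Newton loop:
function NLFunc!( f, T, dx ) # nonlinear residual: the conductivity now depends on T
    for i in 2:length(T)-1
        kW   = 1.0 + (0.5*(T[i-1] + T[i]  ))^2   # face conductivity k(T) = 1 + T^2
        kE   = 1.0 + (0.5*(T[i]   + T[i+1]))^2
        qW   = -kW * (T[i]   - T[i-1]) / dx
        qE   = -kE * (T[i+1] - T[i]  ) / dx
        f[i] = (qE - qW) / dx
    end
    f[1]   = -2T[1] + T[2]
    f[end] = T[end-1] - 2T[end]
    nothing
end
function newton_sketch()
    nx, dx = 11, 1.0
    T   = ones(Float64,nx)
    R!  = (f, x) -> NLFunc!(f, x, dx)
    # Pattern and coloring are computed once; the stencil does not change with T
    input    = rand(nx)
    output   = similar(input)
    jac      = Float64.(sparse(jacobian_sparsity(R!, output, input)))
    colors   = matrix_colors(jac)
    f = zero(T)
    for iter in 1:10
        R!( f, T )
        println("iter ", iter, ": norm f = ", norm(f))
        norm(f) < 1e-10 && break
        # Jacobian is re-evaluated at the current T every iteration
        forwarddiff_color_jacobian!(jac, R!, T, colorvec = colors)
        T .-= jac \ f
    end
end
But I do not know whether this is how one is supposed to reuse the coloring for nonlinear problems, or how well it carries over to 2D/3D stencils.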
I realize that this topic is close to that one but I’ve somehow not been able to apply the tricks discussed there to my example…
Thanks in advance for your hints!
Cheers