Implementing Laplacian operator acting on a 2D array

Hello,
I’m interested in implementing the Laplacian operator \nabla^2=\partial_x^2+\partial_y^2 so that I can apply it to a matrix A_{ij} and obtain the corresponding matrix B_{ij}=(\nabla^2 A)_{ij}. For example, the action of \nabla^2 on A is approximated by a matrix B with elements:

B[i,j]=\frac{A[i+1,j]-2A[i,j]+A[i-1,j]}{dx^2}+\frac{A[i,j+1]-2A[i,j]+A[i,j-1]}{dy^2}
with periodic boundary conditions (of course this is just an approximation with error of order dx^2+dy^2, and more accurate formulas can be used).
I was thinking of using DiffEqOperators.jl, but it seems that it is in the process of being deprecated. Does anyone have any suggestions on how I should implement it? Performance and efficiency are a big concern, since I’m going to use this in an SDE that I need to solve.
Thanks!

Don’t know how this fits with your efficiency concern, as it (may) imply making a grid copy. Sorry for the weird syntax, but this (grdmath) GMT.jl module is impossibly hard to convert into the nice keyword-name syntax of the rest of the GMT.jl package.

using GMT

G = GMT.peaks();

Gc = gmt("grdmath ? CURV", G);    # The Laplacian

You can always just write a loop in Julia. Loops are fast. For example, this post gives example code for \nabla^2 in N dimensions with a loop: Seemingly unnecessary allocation within for loop - #8 by stevengj … writing code that only works in 2D is even easier.

For periodic boundary conditions, the naive approach is to compute i\pm 1 modulo the size in the loop body, but this will slow things down a lot. A very flexible approach is to use “ghost cells” (padding) as in the code linked above: the loop body then doesn’t need to care about the boundary conditions at all, and these are enforced in a separate (cheap) loop over the boundaries.
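For concreteness, here is a minimal sketch of the ghost-cell approach, assuming dx = dy = h for brevity (all names are illustrative, not from the linked post):

# Ap is a padded (n+2)×(m+2) array; Ap[2:end-1, 2:end-1] holds the physical grid.
function fill_ghosts!(Ap)
    @views begin
        Ap[1, :]   .= Ap[end-1, :]   # each ghost edge copies the opposite interior edge
        Ap[end, :] .= Ap[2, :]
        Ap[:, 1]   .= Ap[:, end-1]
        Ap[:, end] .= Ap[:, 2]
    end
    return Ap
end

function laplacian!(B, Ap, h)
    fill_ghosts!(Ap)
    @inbounds for j in 2:size(Ap, 2)-1, i in 2:size(Ap, 1)-1  # j outer: column-major order
        B[i-1, j-1] = (Ap[i+1, j] + Ap[i-1, j] + Ap[i, j+1] + Ap[i, j-1] - 4 * Ap[i, j]) / h^2
    end
    return B
end

The corner ghost cells can be left untouched, since the 5-point stencil never reads them.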

You may want to have a look at ParallelStencil.jl. Alternatively, I like defining my grid as a graph with Graphs.jl (see the grid function), from which you can easily derive a Laplacian matrix with laplacian_matrix, returned as a sparse array.
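A minimal sketch of that Graphs.jl route (grid sizes and spacing are illustrative; this assumes dx == dy, since all graph edges get unit weight, and it is worth double-checking that grid’s vertex numbering matches vec’s column-major order):

using Graphs, SparseArrays

n, m = 64, 64
h = 0.1                                  # grid spacing, dx = dy = h assumed

g = Graphs.grid([n, m]; periodic=true)   # a torus, i.e. periodic boundary conditions
L = laplacian_matrix(g)                  # sparse D - A

# The graph Laplacian is the *negative* of the discrete \nabla^2 (up to 1/h^2):
lap(A) = reshape(-(L * vec(A)) ./ h^2, n, m)

Since L is sparse and constant, applying it is a single sparse matrix–vector multiply.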

Thanks for the reply! Yes, using loops was initially my approach. However, I’m solving a system of stochastic differential equations using DifferentialEquations.jl.
The reason why I need the Laplacian of a matrix is that my deterministic drift function f involves that operator. To be more specific, my equation looks something like
\begin{equation} du=f(u,t)\,dt+g\,dW \end{equation}
where u is a matrix and f involves \nabla^2 u.
So I need to implement the Laplacian operator inside the f function, and I think vectorized operations are preferable to loops in this case.

Loops can be just as fast as (and often faster than) vectorized operations, regardless of whether they are in your ODE function. An advantage of vectorized operations is that they can sometimes be easier for AD tools to handle, if you are using an implicit method that requires the Jacobian of your rhs.
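For concreteness, here is a minimal sketch of wiring a loop-based kernel (such as the ghost-cell laplacian! above) into an in-place drift for DifferentialEquations.jl; the coefficient D, the noise amplitude, and the solver choice are placeholders, not taken from the original question:

using DifferentialEquations

function f!(du, u, p, t)
    Ap, h, D = p.Ap, p.h, p.D
    Ap[2:end-1, 2:end-1] .= u   # copy the state into the padded cache
    laplacian!(du, Ap, h)       # du .= discrete ∇²u with periodic BCs
    du .*= D                    # ... plus whatever else the drift involves
    return nothing
end

g!(du, u, p, t) = (du .= 0.1)   # placeholder diagonal noise amplitude

n = 64
p = (Ap = zeros(n + 2, n + 2), h = 0.1, D = 1.0)
prob = SDEProblem(f!, g!, rand(n, n), (0.0, 1.0), p)
sol = solve(prob, SOSRI());

Keeping the preallocated padded cache in p makes the drift allocation-free, which matters when the solver calls it many times per step.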

I would use Tullio.jl for that.

On CPU it offers very good multi-threaded and gradient performance.

julia> using Tullio

julia> A = randn((10, 10));

julia> Δ(A) = @tullio B[i+_, j+_] := (A[i+1, j] - 2 * A[i, j] + A[i-1, j]) + (A[i, j+1] - 2 * A[i, j] + A[i, j-1])
Δ (generic function with 1 method)

julia> @time Δ(A);
  0.457992 seconds (270.59 k allocations: 17.566 MiB, 100.00% compilation time)

julia> @time Δ(A)
  0.000007 seconds (9 allocations: 864 bytes)
8×8 Matrix{Float64}:
 -0.0784055  -1.64834   -2.44478     0.0761991  -1.45282   3.00903  -3.56879   -2.53072
 -2.7343      4.63921   -1.05534    -0.246098   -2.90363  -1.33712   5.40146    3.40203
  4.6931      1.84561   -0.943601   -1.93862     8.28114  -2.25236  -2.94403   -1.22829
  3.99363    -4.24829   -0.167599   -0.433333    2.48944  -6.43314   3.48072    0.928807
 -6.78706    -1.48015    4.1498      0.27931    -4.54315  -3.31616  -0.214246   5.33025
 -1.26794     2.69559   -3.03843     3.8154      2.26288   8.51866  -4.4771    -6.78285
  4.50582    -3.40961    3.5233     -5.4003     -2.2904   -2.97497  -1.12548    4.95231
 -0.801559    0.711517  -0.0328071  -0.444737    3.89509  -6.35943  10.4327     2.36326
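Note that this version shrinks B (10×10 input, 8×8 output) and leaves out both the periodic boundary and the 1/dx^2, 1/dy^2 factors from the original post. If I remember correctly, Tullio can also wrap indices with mod(...) (worth checking against the Tullio README), which would give a same-size periodic version:

# assumes `using Tullio` as above
Δp(A, dx, dy) = @tullio B[i, j] := (A[mod(i+1), j] - 2 * A[i, j] + A[mod(i-1), j]) / dx^2 +
                                   (A[i, mod(j+1)] - 2 * A[i, j] + A[i, mod(j-1)]) / dy^2

As noted above, mod-indexing in the inner loop may cost some performance relative to ghost cells, so it is worth benchmarking both.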