[ANN] OperatorLearning.jl: Functional mappings to solve parametric PDEs

Dear Community,

I’d like to share a package with you (my first one, actually) that I created recently for learning (nonlinear) operators that solve PDEs: OperatorLearning.jl

This is basically a port of Zongyi Li’s Fourier Neural Operator (FNO) and Lu Lu’s DeepONet, the latter of which is currently implemented in DeepXDE.

I simply wanted to use these architectures in Julia, with some added flexibility and the nice syntax we all love :blush:

Last time I checked, this implementation of the FNO even trains a little faster than the original version on the Burgers’ equation example that Li and colleagues provide, thanks to the awesome work of the Flux.jl team :tada:
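For anyone unfamiliar with the FNO, the heart of each Fourier layer is a spectral convolution: transform the input to frequency space, apply learned weights to the lowest few modes, truncate the rest, and transform back. Here is a minimal, self-contained sketch of that idea. To be clear, this is illustrative only and not the OperatorLearning.jl API; `spectral_conv` and its arguments are hypothetical names, and the real layer also carries a pointwise linear bypass and works on batched multi-channel data.

```julia
using FFTW  # provides rfft/irfft (a registered package, not stdlib)

# Hypothetical sketch of a 1-D spectral convolution, the core of an FNO
# Fourier layer. `W` holds one learned complex weight per retained mode.
function spectral_conv(x::Vector{Float64}, W::Vector{ComplexF64}, modes::Int)
    x̂ = rfft(x)                              # real FFT of the input signal
    ŷ = zeros(ComplexF64, length(x̂))         # start with all modes zeroed
    ŷ[1:modes] .= W[1:modes] .* x̂[1:modes]   # weight the lowest `modes` only
    irfft(ŷ, length(x))                      # back to physical space
end
```

With `W` set to all ones and `modes` covering the full spectrum, this reduces to the identity; in practice `modes` is small, so the layer acts as a learned low-pass filter whose weights are trained by gradient descent like any other Flux parameter.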

It’s far from complete, and there are still some features that I would like to incorporate. Most importantly, I want to add physics-informed losses to reduce the amount of data needed for training, following the respective recent works on physics-informed operator learning.

I’m looking forward to your impressions! Of course, if you see something wrong in the code or with the package, feel free to let me know.


Looks like there is an existing implementation of some of these operator-learning methods (https://github.com/foldfelis/NeuralOperators.jl). Did your implementations end up being similar?

Hi, thanks for bringing it up. I sort of realized that afterwards as well. Both packages include the Fourier Neural Operator (though the rest of each package goes in a different direction), but as far as I can tell the implementations are rather different when comparing the relevant sources of OperatorLearning vs. NeuralOperators side by side.

I’m not really sure about which one would perform better on a given problem, but I guess having choices doesn’t hurt :smile:
