EquivariantOperators.jl implements fully differentiable finite difference operators and convolutions on scalar or vector fields in 2D/3D, in Julia. It can be run forward for FDTD simulation or image processing, or backpropagated through for machine learning, image classification, or inverse problems. The emphasis is on symmetry-preserving, rotation-equivariant operators, including differential operators, common Green's functions, and parametrized neural operators. Nonorthogonal and periodic grids are supported. Feedback appreciated!
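To make the finite-difference side concrete, here's a minimal, self-contained sketch (generic Julia, not the package's actual API) of a central-difference gradient on a periodic 2D grid; written with `circshift`, it's plain array code that reverse-mode AD can in principle trace through:

```julia
# Minimal sketch, NOT EquivariantOperators.jl's API: central-difference
# gradient of a scalar field on a periodic 2D grid with spacing dx.
function grad(f::AbstractMatrix, dx::Real)
    ∂x = (circshift(f, (-1, 0)) - circshift(f, (1, 0))) / 2dx  # (f[i+1,j] - f[i-1,j]) / 2dx
    ∂y = (circshift(f, (0, -1)) - circshift(f, (0, 1))) / 2dx
    (∂x, ∂y)
end

n = 32
f = [sin(2π * i / n) for i in 1:n, j in 1:n]  # scalar field sin(2πx)
∂x, ∂y = grad(f, 1 / n)                       # ∂x ≈ 2π·cos(2πx), ∂y ≈ 0
```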
How does it compare to
- GitHub - e3nn/e3nn: A modular framework for neural networks with Euclidean symmetry
- GitHub - QUVA-Lab/escnn: Equivariant Steerable CNNs Library for Pytorch
- GitHub - lucidrains/egnn-pytorch: Implementation of E(n)-Equivariant Graph Neural Networks, in Pytorch

etc.?
btw the "Edit on GitHub" button is broken; for people who want to look at the code: GitHub - aced-differentiate/EquivariantOperators.jl
And can you comment on whether this should one day join forces with GitHub - FluxML/GeometricFlux.jl: Geometric Deep Learning for Flux?
e3nn, egnn, and GeometricFlux all use graph (point cloud) data, whereas this package uses grid array data, making it most comparable to escnn. So it's like a CNN, but with radially symmetric kernels of possibly vector (or tensor) values. We have very good image classification and pose estimation results that we'll release soon.
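To illustrate what "radially symmetric kernels" buys you in the scalar case, here's a toy sketch (generic Julia, not the package API): a kernel whose weight depends only on distance from its center commutes exactly with 90° grid rotations under periodic convolution.

```julia
# Toy sketch, not the package API: radially symmetric kernels are
# rotation equivariant on scalar fields.

# Odd-sized kernel whose weight depends only on distance from the center.
function radial_kernel(profile::Function, n::Int)
    c = (n + 1) ÷ 2
    [profile(hypot(i - c, j - c)) for i in 1:n, j in 1:n]
end

# Periodic ("circular") 2D convolution via shifted sums.
function pconv(x::AbstractMatrix, k::AbstractMatrix)
    n = size(k, 1); c = (n + 1) ÷ 2
    sum(k[i, j] * circshift(x, (c - i, c - j)) for i in 1:n, j in 1:n)
end

x = rand(16, 16)
k = radial_kernel(r -> exp(-r^2), 5)  # Gaussian-like radial profile

# Exact 90° rotation commutes with the radial convolution.
@assert pconv(rotl90(x), k) ≈ rotl90(pconv(x, k))
```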
Since differential operators are also equivariant, we included prebuilt finite difference operators, so people can also use this for FDM or FDTD work. So it started as an ML package that also does FDM lol
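As a taste of that FDM use case (again generic code, not the package's prebuilt operators): a 5-point periodic Laplacian driving one explicit heat-equation step.

```julia
# Generic FDM sketch, not the package's prebuilt operators: 5-point
# Laplacian on a periodic grid, used for one forward-Euler diffusion step.
function laplacian(u::AbstractMatrix, dx::Real)
    (circshift(u, (1, 0)) + circshift(u, (-1, 0)) +
     circshift(u, (0, 1)) + circshift(u, (0, -1)) - 4u) / dx^2
end

n = 64
u = rand(n, n)
dx = 1 / n
dt = 0.2 * dx^2             # below the explicit stability limit dt ≤ dx²/4
u += dt * laplacian(u, dx)  # one step of ∂u/∂t = ∇²u
```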
To add equivariance features to GeometricFlux, all you need is to put spherical harmonic features on the nodes and define your graph convolution to use custom products. Maybe someone can write a simple real-valued spherical harmonics library with all the product rules from Clebsch-Gordan coefficients. (EDIT: or push the product rules into SphericalHarmonics.jl)
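To sketch what that could look like (toy code with hypothetical helper names; a real library would need higher ℓ): for ℓ ≤ 1 the real spherical harmonics are, up to normalization, just constants and unit-vector components, and the Clebsch-Gordan product rules reduce to familiar vector operations.

```julia
# Toy sketch with hypothetical names, not an existing library's API:
# ℓ=1 real spherical harmonic features plus the low-order CG product rules.
using LinearAlgebra

# ℓ=1 real spherical harmonics of a direction are (up to normalization)
# just its unit-vector components.
Y1(r::AbstractVector) = r / norm(r)

# Low-order CG products: 0⊗1→1 (scaling), 1⊗1→0 (dot), 1⊗1→1 (cross).
prod01(s::Real, v) = s * v
prod110(u, v) = dot(u, v)
prod111(u, v) = cross(u, v)

r = [1.0, 2.0, 2.0]  # edge vector between two nodes
y = Y1(r)            # ℓ=1 node/edge feature
s = prod110(y, y)    # invariant scalar (= 1 here)
w = prod01(s, y)     # scaled vector, still equivariant
```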