[ANN] GraphNeuralNetworks.jl

I am happy to announce GraphNeuralNetworks.jl (GNN.jl for short), a graph neural network library written in Julia and based on the deep learning framework Flux.jl. I have been working on it for the past few months, putting a lot of effort into addressing some of the current limitations of GeometricFlux.jl. Now I feel it is stable and documented enough for prime time.

GNN.jl comes with a large set of features:

  • Common graph convolutional layers.
  • Computation on batched graphs.
  • Easy definition of custom layers.
  • CUDA support.
  • Integration with Graphs.jl.
  • Support for node-, edge-, and graph-level machine learning tasks.
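To give a taste of the API, here is a minimal sketch of building a graph and running a small model over it. The graph size, feature dimension, and layer widths are arbitrary choices for illustration; the names (`rand_graph`, `GCNConv`, `GNNChain`) follow the package documentation at the time of this announcement and may differ in later versions.

```julia
using Flux, GraphNeuralNetworks

# A random graph with 10 nodes, 30 edges, and 3-dimensional node features.
g = rand_graph(10, 30, ndata = rand(Float32, 3, 10))

# A small GNN: two graph-convolutional layers followed by a dense readout.
model = GNNChain(GCNConv(3 => 16, relu),
                 GCNConv(16 => 16, relu),
                 Dense(16, 2))

# The model takes the graph and the node feature matrix;
# the output has one column per node.
y = model(g, g.ndata.x)   # 2 × 10
```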

In the examples folder you will find some scripts solving paradigmatic tasks:

  • Semi-supervised node classification: Given some feature vectors for each node in a graph and the labels of only a small subset of them, infer the labels of the remaining nodes.
  • Link prediction: Predict the existence of unobserved edges in a graph exploiting the observed topology and node features.
  • Graph classification: A supervised classification task where each input is a whole graph with associated node/edge features. Each graph is assigned to one of several classes, and evaluation is performed on graphs never seen during training (a batching sketch follows this list).
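As a rough sketch of how batched graphs and graph-level prediction fit together, consider the toy setup below. The dataset, labels, and layer sizes are made up for illustration; batching via `Flux.batch` and the `GlobalPool` readout follow the package documentation at the time of this announcement.

```julia
using Flux, GraphNeuralNetworks, Statistics

# Hypothetical toy dataset: 100 random graphs, each with a binary label.
graphs = [rand_graph(10, 30, ndata = rand(Float32, 3, 10)) for _ in 1:100]
labels = rand(Bool, 100)

# Several graphs are batched into a single GNNGraph with disjoint components,
# so one forward pass processes the whole batch.
gbatch = Flux.batch(graphs[1:32])

model = GNNChain(GCNConv(3 => 32, relu),
                 GlobalPool(mean),   # graph-level readout: mean over each graph's nodes
                 Dense(32, 1))

ŷ = model(gbatch, gbatch.ndata.x)    # 1 × 32, one prediction per graph

# Supervised training would minimize a standard classification loss.
loss = Flux.Losses.logitbinarycrossentropy(vec(ŷ), Float32.(labels[1:32]))
```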

GNN.jl is largely inspired by PyTorch Geometric, Deep Graph Library, and GeometricFlux.jl.
