GeometricFlux.jl - Is there a true variable graph layer?

I’m looking into using GitHub - yuehhua/GeometricFlux.jl: Geometric Deep Learning for Flux. I notice that, as per the docs page Graph passing · GeometricFlux, to pass a variable graph you need to define the GCNConv layer by passing in a FeaturedGraph. However, the FeaturedGraph is constructed from an adjacency matrix, meaning that you need to know the graph structure before defining the GNN. Am I missing something, or does functionality exist so that a given GNN can take in more than one kind of graph structure?

I think the message passing layer should work for any graph. The docs are difficult for me to read. I once dug deeper to compare the speed of its message passing to my own CPU-only implementation based on Mill.jl (GNNs in 16 lines · Mill.jl), and it was working for any graph.

What did you pass in when defining the GCNConv layer? Was it just defined with an arbitrary adjacency matrix? That seems a bit odd.

@Tomas_Pevny Sorry the documentation is difficult to read. There is certainly room to improve it.

@emsal When using a variable graph, you don’t need to give a FeaturedGraph to construct a GCNConv layer. Just pass it as input to the GCNConv layer. Use a FeaturedGraph to bundle your features together with each different graph structure.
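If I understand the intended usage correctly, the variable-graph pattern looks roughly like this. This is a sketch, not a definitive example: it assumes the `GCNConv(in => out)` constructor and the `FeaturedGraph(adj; nf=...)` constructor from GeometricFlux/GraphSignals, and the adjacency matrices and layer sizes are made up for illustration:

```julia
using GeometricFlux, GraphSignals, Flux

# One layer, defined without any graph structure (variable-graph mode).
layer = GCNConv(4 => 8)

# Two different graphs, each bundled with its own node features
# (feature convention here is channels × nodes).
adj1 = [0 1 0; 1 0 1; 0 1 0]                   # 3-node path graph
adj2 = [0 1 1 1; 1 0 0 0; 1 0 0 0; 1 0 0 0]    # 4-node star graph

fg1 = FeaturedGraph(adj1, nf=rand(Float32, 4, 3))
fg2 = FeaturedGraph(adj2, nf=rand(Float32, 4, 4))

# The same layer accepts both graphs; the graph structure travels
# with the features inside the FeaturedGraph.
out1 = layer(fg1)
out2 = layer(fg2)
```

So the layer itself stays graph-agnostic, and each forward pass reads the structure from the FeaturedGraph it receives.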

Thank you, that makes sense. Are there any significant effects that come from the choice of the initial adjacency matrix used to define the GCN layer?

If you have a static graph, meaning the same graph structure is passed through your GNN model every time, I suggest initializing the GCN layer with an adjacency matrix. The normalized Laplacian matrix is then pre-computed once, which reduces the computational effort.
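For comparison, a static-graph sketch might look like the following. Again this is an assumption about the constructor form described in this thread (a GCNConv built from a FeaturedGraph wrapping a fixed adjacency matrix); the graph and sizes are invented for illustration:

```julia
using GeometricFlux, GraphSignals, Flux

adj = [0 1 0; 1 0 1; 0 1 0]   # fixed 3-node path graph

# Static-graph mode: the graph structure is baked into the layer at
# construction time, so the normalized Laplacian can be precomputed
# once instead of being rebuilt on every forward pass.
layer = GCNConv(FeaturedGraph(adj), 4 => 8)

# With the structure fixed, the layer takes a plain feature matrix
# (channels × nodes) instead of a FeaturedGraph.
X = rand(Float32, 4, 3)
Y = layer(X)
```

The trade-off is flexibility versus speed: the variable-graph form recomputes graph-dependent quantities per call, while the static form amortizes them across the whole training run.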
