Terminology: "multi branch neural network" or what?

Hello, what is the correct name for a neural network architecture where multiple “branches” are learned, possibly but not necessarily together, and each branch “ends” with an encoded representation of its respective features, which is then decoded by subsequent layers to arrive at a final yhat?

Like the following one:

I am using “multiple-branch” in my tutorial, but I can’t find many references using this term, so I think there must be another name for it that I can’t find…
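In case the picture doesn’t come through, here is a rough sketch (my own, assuming Keras and arbitrary layer sizes) of the kind of architecture I mean: two branches, each ending in its own encoded representation, which are then merged and decoded by subsequent layers into a single yhat.

```python
from tensorflow.keras import layers, Model

# Branch A: its own input, ending in an encoded representation
in_a = layers.Input(shape=(32,), name="input_a")
enc_a = layers.Dense(16, activation="relu", name="encoding_a")(in_a)

# Branch B: a second input, also ending in an encoding
in_b = layers.Input(shape=(64,), name="input_b")
enc_b = layers.Dense(16, activation="relu", name="encoding_b")(in_b)

# The encodings are merged and decoded by subsequent layers into yhat
merged = layers.concatenate([enc_a, enc_b])
decoded = layers.Dense(8, activation="relu")(merged)
yhat = layers.Dense(1, name="yhat")(decoded)

model = Model(inputs=[in_a, in_b], outputs=yhat)
```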


Search for backbone and see if you find something that matches well enough. A split at the other end of the network generally goes under the name multiple heads.
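To illustrate the “multiple heads” usage, a minimal sketch (assuming Keras, with arbitrary sizes and task heads): one shared backbone whose output splits into several task-specific heads.

```python
from tensorflow.keras import layers, Model

inp = layers.Input(shape=(32,), name="input")

# Shared backbone (trunk)
h = layers.Dense(64, activation="relu")(inp)
h = layers.Dense(32, activation="relu")(h)

# Two heads branching off the same backbone output
class_head = layers.Dense(10, activation="softmax", name="class_head")(h)
reg_head = layers.Dense(1, name="regression_head")(h)

model = Model(inputs=inp, outputs=[class_head, reg_head])
```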


Yes, I see “multi-branch” as a purely descriptive term rather than something which can be used to classify different types of networks (e.g. CNN vs RNN vs MLP vs Transformer). That’s probably why a search for it doesn’t turn up any results—the term is sufficiently generic that it’s not a good label. So many modern NN architectures have branching because of residual and skip connections, and it would not be a stretch to throw techniques such as Mixture-of-Experts under this umbrella too.
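For instance, a single residual (skip) connection already creates a branch in this generic sense; a minimal sketch, assuming Keras:

```python
from tensorflow.keras import layers, Model

x_in = layers.Input(shape=(32,))
h = layers.Dense(32, activation="relu")(x_in)   # transformed branch
out = layers.add([x_in, h])                     # merged with the identity branch
res_block = Model(x_in, out)
```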


From what I understand, you are looking for a generic term for what happens in the picture as opposed to a model where each node just takes a single input (and therefore has no “branches”), not something to label this particular neural network architecture with.

I’m no authority in the area, but I had to think about this more than I would have liked to when I made NaiveNASlib handle any architecture. I think multi-branch is fine. Frameworks tend not to go out of their way to separate this type of network from the special case where there are no branches. The closest I can think of is the Keras docs on the Sequential vs. functional API, where they use terms like “non-linear topology” and “directed acyclic graph of layers”.
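For reference, a minimal sketch of that distinction (my own example, not taken from the Keras docs): the Sequential API can only express the branch-free special case, while the functional API builds an arbitrary directed acyclic graph of layers.

```python
from tensorflow.keras import layers, Model, Sequential

# Sequential: a plain stack of layers, no branches possible
seq = Sequential([
    layers.Input(shape=(8,)),
    layers.Dense(16, activation="relu"),
    layers.Dense(1),
])

# Functional: a directed acyclic graph of layers ("non-linear topology")
x = layers.Input(shape=(8,))
a = layers.Dense(16, activation="relu")(x)
b = layers.Dense(16, activation="relu")(x)   # second branch off the same input
out = layers.Dense(1)(layers.concatenate([a, b]))
dag = Model(x, out)
```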


Thank you everyone… indeed the closest established term seems to be “backbone”, but it’s not one I like much, so I’ll stay with the generic “multiple-branch” to describe this situation… thanks

When looking for a similar neural network architecture per se, try The Neural Network Zoo - The Asimov Institute.

Neural Network Zoo | September 14, 2016: “With new neural network architectures popping up every now and then, it’s hard to keep track of them all. Knowing all the abbreviations being thrown around (DCIGN, BiLSTM, DCGAN, anyone?) can be a bit overwhelming at first. So I decided to compose a cheat sheet containing many of those architectures.”

There you’ll find what I saw at Hinton’s announcement of Capsule Networks (CapsNet); your description seems to be in the neighborhood of “Dynamic Routing Between Capsules” (see the reference below).

Capsule Networks (CapsNet) are biology inspired alternatives to pooling, where neurons are connected with multiple weights (a vector) instead of just one weight (a scalar). This allows neurons to transfer more information than simply which feature was detected, such as where a feature is in the picture or what colour and orientation it has. The learning process involves a local form of Hebbian learning that values correct predictions of output in the next layer.

Sabour, Sara, Nicholas Frosst, and Geoffrey E. Hinton. “Dynamic Routing Between Capsules.” In Advances in Neural Information Processing Systems (2017): 3856–3866. Original paper PDF: [1710.09829] Dynamic Routing Between Capsules.
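As a concrete taste of the “vector instead of scalar” idea, here is a small sketch of the squash nonlinearity from Sabour et al. (2017), which keeps a capsule’s output as a vector whose length plays the role a scalar activation would normally play (NumPy, my own transcription of the paper’s formula):

```python
import numpy as np

def squash(s, eps=1e-8):
    """Squash a capsule's input vector s so its length lies in [0, 1)."""
    # s: array whose last axis is the capsule dimension
    sq_norm = np.sum(s ** 2, axis=-1, keepdims=True)
    return (sq_norm / (1.0 + sq_norm)) * s / np.sqrt(sq_norm + eps)
```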

Also worth a look: Neural Network Zoo Prequel: Cells and Layers | March 31, 2017, at The Asimov Institute. The Neural Network Zoo shows different types of cells and various layer connectivity styles, but it doesn’t really go into how each cell type works.

Source: when looking for a similar neural network architecture, a good keyword search is “The Asimov Institute Neural Network Zoo new neural network architectures”; the top search result is The Neural Network Zoo - The Asimov Institute.


The term multi-modal neural network is sometimes used to indicate multiple inputs (of different kinds).
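A rough sketch of that sense (my own example, assuming Keras, with made-up shapes): two inputs of different kinds, an image and a token sequence, each handled by its own branch before merging.

```python
from tensorflow.keras import layers, Model

# Image branch
img_in = layers.Input(shape=(64, 64, 3), name="image")
img_feat = layers.Conv2D(8, 3, activation="relu")(img_in)
img_feat = layers.GlobalAveragePooling2D()(img_feat)

# Text branch (sequence of token ids)
txt_in = layers.Input(shape=(20,), dtype="int32", name="tokens")
txt_feat = layers.Embedding(input_dim=1000, output_dim=16)(txt_in)
txt_feat = layers.GlobalAveragePooling1D()(txt_feat)

# Merge the two modalities and predict a single yhat
merged = layers.concatenate([img_feat, txt_feat])
yhat = layers.Dense(1, activation="sigmoid")(merged)

model = Model(inputs=[img_in, txt_in], outputs=yhat)
```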
