Hello, what is the usual name for the concept of a “transversal” or “cross section” layer in a neural network? I can’t find anything relevant using these two terms…
Its defining characteristic is that its activation function takes all the outputs of the previous layer as input, and the layer’s output size is given by the output dimension of that activation function.
I implemented a generic one in my own NN library, to use either for classification (with the softmax activation function) or for pooling the neurons of the previous layer (max, avg, …).
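To make the idea concrete, here is a minimal sketch in plain Julia of what I mean (the names are illustrative only, not my actual library code or any existing API):

```julia
# Sketch of a "transversal" layer: the function f is applied to the whole
# output vector of the previous layer, and the layer's output size is
# whatever size f returns. Names are hypothetical.
struct TransversalLayer{F}
    f::F   # e.g. a softmax, maximum, mean, ...
end

# Forward pass: no weights, just apply f to the entire previous-layer output.
(l::TransversalLayer)(x::AbstractVector) = l.f(x)

# A plain softmax for the classification case.
softmax(x) = exp.(x .- maximum(x)) ./ sum(exp.(x .- maximum(x)))

# Usage:
x = [1.0, 2.0, 3.0]
classifier = TransversalLayer(softmax)   # output size == input size (3)
pooler     = TransversalLayer(maximum)   # output collapses to a single value
classifier(x)   # -> 3-element probability vector
pooler(x)       # -> 3.0
```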
However, I can’t find what this kind of layer is called in the literature or in standard NN libraries such as Flux or Knet…