I want to design a neural network whose output is the combination of the outputs of two neural networks; a very simple example would be the sum of the outputs of two Dense layers.
Intuitively, I would want to write something like

```julia
model = Dense(2, 2) + Dense(2, 2)
```

which throws an error.
An alternative would be to do something like (in Flux)

```julia
model_1 = Dense(2, 2)
model_2 = Dense(2, 2)
model(x) = model_1(x) + model_2(x)
```
but then `model` is not a layer, so I cannot use APIs such as `Flux.params` or `Optimisers.destructure` (to cite just a few).
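The best I managed in that direction is to wrap the two models in a custom struct and register it as a layer. Here is a minimal sketch, assuming Flux's `@functor` mechanism (`AddLayers` is just a name I made up):

```julia
using Flux, Optimisers

# wrap the two sub-models in a struct and register it,
# so that the combination behaves like a proper layer
struct AddLayers{A,B}
    m1::A
    m2::B
end
Flux.@functor AddLayers

# the forward pass sums the two sub-models' outputs
(m::AddLayers)(x) = m.m1(x) + m.m2(x)

model = AddLayers(Dense(2, 2), Dense(2, 2))
Flux.params(model)                        # collects both layers' parameters
flat, re = Optimisers.destructure(model)  # destructure works as well
```

This works, but it needs bespoke boilerplate for every new combination.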
Is there a more elegant way to do this that would return an object like `Chain` or something similar?
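For what it's worth, Flux's `Parallel` combinator looks close to what I am after, if I read its docs correctly; I would be glad to know whether this is the idiomatic approach (and what the Lux counterpart is):

```julia
using Flux

# Parallel applies each layer to the same input and merges
# the outputs with the given connection function (here +)
model = Parallel(+, Dense(2, 2), Dense(2, 2))
Flux.params(model)  # works, since Parallel is a proper layer
```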
PS1: I am currently using Lux, but I wrote the example in Flux for convenience. I imagine a solution in one would work in the other with the appropriate modifications.
PS2: Just to be sure that no one says it: I know that, by linearity, I could replace this model with a single Dense layer. I am trying to do this with more complicated models.
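(For instance, with the default identity activation the sum collapses into a single layer; a quick sanity check, assuming Flux's `Dense(weight, bias)` constructor:

```julia
using Flux

m1, m2 = Dense(2, 2), Dense(2, 2)
# one Dense layer whose weight and bias are the sums of the originals
merged = Dense(m1.weight .+ m2.weight, m1.bias .+ m2.bias)

x = randn(Float32, 2)
m1(x) .+ m2(x) ≈ merged(x)  # true: (W₁ + W₂)x + (b₁ + b₂)
```

With nonlinear activations or deeper sub-models this collapse no longer applies, which is exactly the case I care about.)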