Flux/Lux - Combining two neural networks

I want to design a neural network whose output is the combination of the outputs of two neural networks; a very simple example would be the output of two Dense layers.

Intuitively, I would want to write something like
model=Dense(2,2)+Dense(2,2)
which throws an error.

An alternative would be to do something like (in Flux)

model_1=Dense(2,2)
model_2=Dense(2,2)
model(x)=model_1(x)+model_2(x)

but then model is not a layer, so I cannot use APIs such as Flux.params or Optimisers.destructure (to cite just a few).

Is there a way to do it in a more elegant way which would return me an object like Chain or similar?

PS1: I am currently using Lux, but I wrote the example in Flux for convenience. I imagine a solution in one would work in the other with the appropriate modifications.
PS2: Just to be sure that no one says it, I know that by linearity I could train the model in a single Dense layer. I am trying to do this with more complicated models.


I think you want Parallel:

julia> using Flux

julia> m1 = Dense([1 2; 3 4]);

julia> m2 = Dense([5 6; 7 8]);

julia> x = [9, 10];

julia> m1(x)
2-element Vector{Int64}:
 29
 67

julia> m2(x)
2-element Vector{Int64}:
 105
 143

julia> m3 = Parallel(+, m1, m2)
Parallel(
  +,
  Dense(2 => 2),                        # 6 parameters
  Dense(2 => 2),                        # 6 parameters
)                   # Total: 4 arrays, 12 parameters, 352 bytes.

julia> m3(x)
2-element Vector{Int64}:
 134
 210
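And to the point of your workaround not being a layer: Parallel is an ordinary Flux layer, so the parameter-handling APIs you mentioned work on it, and it composes inside Chain like anything else. A minimal sketch (Flux.params still works on recent 0.14-era Flux, though the explicit-gradient style is now preferred):

```julia
using Flux, Optimisers

m3 = Parallel(+, Dense(2 => 2), Dense(2 => 2))

# Parameter APIs see all four arrays (two weights, two biases):
ps = Flux.params(m3)
flat, re = Optimisers.destructure(m3)  # flatten to one vector, plus a rebuilder
length(flat)  # 12 parameters in total

# Parallel also nests inside Chain like any other layer:
model = Chain(Parallel(+, Dense(2 => 2), Dense(2 => 2)), Dense(2 => 1))
y = model(rand(Float32, 2))
```

Here `re(flat)` rebuilds the model from the flat vector, which is what you'd use for e.g. optimization libraries that want a single parameter vector.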
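Since PS1 mentions Lux: Lux also ships a Parallel container with the same connection-first signature, just with Lux's explicit-parameter calling convention. A minimal sketch, assuming a recent Lux version:

```julia
using Lux, Random

# Same idea: apply both Dense layers to the input and combine with +
model = Parallel(+, Dense(2 => 2), Dense(2 => 2))

rng = Random.default_rng()
ps, st = Lux.setup(rng, model)  # explicit parameters and states

x = Float32[9, 10]
y, st = model(x, ps, st)  # y is the elementwise sum of the two outputs
```

Because `ps` is already a plain NamedTuple, it plugs directly into Optimisers.jl or ComponentArrays.jl without needing destructure.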