What does the architecture of Flux.RNN look like?

Hi all,
I am confused about the architecture of RNN in Flux.jl. If I define an RNN using the code RNN(3,3), will it be like the left one or the right one in the picture below?


Based on the documentation, I think it is more like the right one, but I am not sure:

Flux.RNN — Function

RNN(in::Integer, out::Integer, σ = tanh)

The most basic recurrent layer; essentially acts as a Dense layer, but with the output fed back into the input each time step.
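If I read that correctly, the layer keeps a hidden state between calls, so applying it once per time step should already give one output per input. Here is a small sketch of what I tried, assuming the Flux API where RNN(3,3) returns a stateful layer (this matches the constructor in the docs above; please correct me if my understanding of reset! or the statefulness is wrong):

```julia
using Flux

m = RNN(3, 3)             # 3 inputs, 3 outputs per time step
x = rand(Float32, 3)

y1 = m(x)                 # output at step 1
y2 = m(x)                 # same input, different output, because the
                          # hidden state from step 1 was fed back in

Flux.reset!(m)            # reset the hidden state before a new sequence

# Apply over a whole sequence: one output vector per input vector
xs = [rand(Float32, 3) for _ in 1:5]
ys = [m(x) for x in xs]   # 5 outputs for 5 inputs
```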

I actually want to define a multi-input and multi-output RNN like the right one in the figure. If Flux.RNN() cannot achieve this, does anyone have ideas on how to write it manually?
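To make it concrete, here is roughly what I mean by "multi-input and multi-output": the recurrence h_t = tanh(Wx*x_t + Wh*h_(t-1) + b), with the output y_t = h_t emitted at every step. This is only a hand-written sketch without Flux; MyRNN and all the names are made up by me:

```julia
# A minimal hand-written RNN cell (names are my own, not a Flux API).
struct MyRNN
    Wx::Matrix{Float32}   # input-to-hidden weights
    Wh::Matrix{Float32}   # hidden-to-hidden (recurrent) weights
    b::Vector{Float32}    # bias
end

MyRNN(in::Int, out::Int) = MyRNN(randn(Float32, out, in),
                                 randn(Float32, out, out),
                                 zeros(Float32, out))

# Run over a whole sequence: one output per input, like the
# right-hand diagram in the picture.
function (m::MyRNN)(xs::Vector{<:AbstractVector})
    h = zeros(Float32, length(m.b))
    ys = Vector{Vector{Float32}}(undef, length(xs))
    for (t, x) in enumerate(xs)
        h = tanh.(m.Wx * x .+ m.Wh * h .+ m.b)
        ys[t] = h
    end
    return ys
end

# Usage: 5 input vectors in, 5 output vectors out
rnn = MyRNN(3, 3)
ys = rnn([rand(Float32, 3) for _ in 1:5])
```

Is this essentially what RNN(3,3) already does internally, just with the state kept inside the layer?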

Thanks a lot for your help!