Transposed convolution (a.k.a. deconvolution) layer in Flux.jl

I would like to use a deconvolutional (transposed convolution) layer in my network.
How can this be done in Flux.jl?

To be specific, the operation I have in mind is described here: https://stackoverflow.com/questions/35049197/how-does-the-unpooling-and-deconvolution-work-in-deconvnet
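
For concreteness, here is a minimal sketch of what I am after, written assuming Flux provides a `ConvTranspose` layer that mirrors `Conv`'s keyword interface (the layer name, the 4x4/stride-2 geometry, and the WHCN data layout are my assumptions here, so please correct me if the actual API differs):

```julia
using Flux

# Downsampling: 4x4 filters, 1 => 16 channels, stride 2 halves the resolution
down = Conv((4, 4), 1 => 16, relu; stride = 2, pad = 1)

# Upsampling: transposed convolution going 16 => 1 channels; with the same
# kernel/stride/pad it restores the original spatial size
up = ConvTranspose((4, 4), 16 => 1, relu; stride = 2, pad = 1)

x = rand(Float32, 28, 28, 1, 1)   # WHCN layout: width, height, channels, batch
size(down(x))                     # (14, 14, 16, 1)
size(up(down(x)))                 # (28, 28, 1, 1)
```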

To complicate things even further, I would also like the deconvolution layer to share its weights with the corresponding convolutional layer.
That is, it should not introduce new trainable parameters; instead, it should reuse for upsampling the same filters that were used for downsampling, as mentioned in the accepted answer of the StackOverflow link above. A sketch of what I mean follows below.
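
For the weight sharing, what I imagine is constructing the transposed layer directly from the convolution's weight array, along these lines (again just a sketch; it assumes `ConvTranspose` accepts a raw weight array and that a 1=>16 `Conv` filter has exactly the layout a 16=>1 `ConvTranspose` expects):

```julia
using Flux

down = Conv((4, 4), 1 => 16, relu; stride = 2, pad = 1)

# Build the transposed layer from the SAME underlying weight array, so the
# filters are shared and no new filter parameters are introduced (only the
# bias of `up` would be new). A 1=>16 Conv weight has size (4, 4, 1, 16),
# which I assume is the layout ConvTranspose expects for a 16=>1 layer.
up = ConvTranspose(down.weight, true, relu; stride = 2, pad = 1)

# Both layers should then reference one and the same array:
down.weight === up.weight   # true
```

If the two layers really do reference a single array, gradients flowing through both of them during training would accumulate into that one array, which is exactly the behaviour I am after. Is something like this supported?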