Transposed convolution (aka deconvolution) layer in Flux.jl

I would like to use a deconvolutional layer (transposed convolution) in my network.
How can this be done in Flux.jl?

To be specific, it is described here: https://stackoverflow.com/questions/35049197/how-does-the-unpooling-and-deconvolution-work-in-deconvnet

To complicate things even further, I would like to enforce sharing of the weights with the corresponding convolutional layer.
That is, I don't want the deconvolution layer to introduce new trainable parameters; instead, it should use the same filters for upsampling that were used for downsampling, as mentioned in the accepted answer of the StackOverflow link above.
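Independent of Flux, the weight-sharing idea can be stated concretely: a convolution is multiplication by a sparse matrix C built from the kernel, and the transposed convolution is multiplication by C', so both directions use exactly the same weights. A minimal 1-D sketch in plain Julia (no Flux; all names here are illustrative, not a Flux API):

```julia
# Dense matrix implementing a "valid" 1-D convolution (cross-correlation)
# of a length-n input with kernel w.
function conv_matrix(w::AbstractVector, n::Integer)
    k = length(w)
    m = n - k + 1                 # output length for "valid" convolution
    C = zeros(eltype(w), m, n)
    for i in 1:m, j in 1:k
        C[i, i + j - 1] = w[j]
    end
    return C
end

w = [1.0, 2.0, 3.0]               # the shared kernel
x = [1.0, 0.0, 2.0, 1.0, 3.0]

C  = conv_matrix(w, length(x))
y  = C * x                        # "downsampling" direction, length 3
xt = C' * y                       # transposed convolution: same weights, back to length 5
```

Note that C' introduces no new parameters; the upsampling path is entirely determined by w.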


Here is an example:

The tricky part is that Flux convolutional layers include a bias by default, which breaks back-and-forth weight sharing, so you have to disable it manually by passing explicit zero biases, as I have done below.

using Flux, Flux.Tracker

# A convolution and its transpose sharing the same weight tensor.
# The biases are untracked zero vectors, so neither layer adds
# trainable bias parameters of its own.
tmp = Conv((3, 3), 3 => 16, stride = 1, pad = 1)
a = Conv(tmp.weight, zeros(eltype(Tracker.data(tmp.weight)), size(tmp.weight, 4)),
         stride = 1, pad = 1)
b = ConvTranspose(tmp.weight, zeros(eltype(Tracker.data(tmp.weight)), size(tmp.weight, 3)),
                  stride = 1, pad = 1)

y = a(X[:, :, :, 1:10])  # X holds CIFAR-10 images, size 32x32x3x(nexamples)
x = b(y)
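As a quick sanity check on the shapes in that example, the standard convolution-arithmetic formulas (which, as far as I know, match Flux's conventions) confirm that a 3x3 kernel with stride 1 and pad 1 preserves spatial size in both directions, so x comes back with the same shape as X[:, :, :, 1:10]:

```julia
# Spatial output size of a convolution and of a transposed convolution,
# for input size n, kernel size k, stride s, padding p.
conv_out(n, k, s, p)  = (n + 2p - k) ÷ s + 1       # Conv
convt_out(n, k, s, p) = (n - 1) * s - 2p + k       # ConvTranspose

down = conv_out(32, 3, 1, 1)    # 32x32 input stays 32x32
up   = convt_out(down, 3, 1, 1) # and comes back as 32x32
```

For stride > 1 the round trip is not always exact (the usual off-by-one of transposed convolutions), so the padding/stride pair needs more care in that case.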
