Alternatives to zero-padding conv layers?

Is there a way to use same-padding or reflection-padding with Flux convolutional layers, such as NNlib.jl supplies in https://github.com/FluxML/NNlib.jl/blob/master/src/padding.jl, rather than only zero-padding?

I’m seeing significant edge effects in an image GAN I’m building and wondered whether one of these other padding schemes would help.

Can you file an issue on NNlib so we can track this? In the meantime, applying the pad_* functions before a conv layer is not too bad a workaround.
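
For example, something like this minimal sketch (the kernel size, channel counts, and pad amounts here are just placeholders):

using Flux, NNlib

# Reflect-pad each spatial dimension by 1 on both sides, then convolve with
# pad=0 so the Conv layer itself adds no zero-padding.
model = Chain(
    x -> NNlib.pad_reflect(x, (1, 1, 1, 1)),  # pads dims 1:2 of a WHCN array
    Conv((3, 3), 3 => 16, relu; pad = 0),
)

x = rand(Float32, 32, 32, 3, 1)  # batch of one 32×32 RGB image
size(model(x))                   # (32, 32, 16, 1): spatial size preserved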

I finally got back to looking at this project a couple of weeks ago. This is the best construction I could come up with that lets me use it just like a standard convolutional layer:

using Flux, NNlib
using Flux: @functor

struct rConv
    a      # per-side padding amounts, as computed by Flux.calc_padding
    layer  # the wrapped Conv layer, constructed with pad=0
end

@functor rConv

function rConv(k::NTuple{N,Integer}, ch::Pair{<:Integer,<:Integer}, σ = identity;
        init = Flux.glorot_uniform, stride = 1, pad = 0, dilation = 1, groups = 1,
        bias = true) where N

    # Flux.calc_padding is a non-exported helper that resolves `pad`
    # (an integer, a tuple, or SamePad()) into an explicit per-side tuple.
    a = Flux.calc_padding(Flux.Conv, pad, k, dilation, stride)
    weight = Flux.convfilter(k, ch; init, groups)

    # The inner Conv gets pad=0; the padding is applied by hand in the forward pass.
    layer = Flux.Conv(weight, bias, σ; stride = stride, pad = 0, dilation = dilation, groups = groups)

    rConv(a, layer)
end

function (m::rConv)(x)
    # Reflect-pad the spatial dimensions, then apply the ordinary convolution.
    m.layer(NNlib.pad_reflect(x, m.a))
end
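
It can then be constructed and called like a plain Conv. A quick shape check (sizes here are just for illustration):

# Same call signature as Flux.Conv, including SamePad():
m = rConv((3, 3), 3 => 16, relu; pad = SamePad())

x = rand(Float32, 64, 64, 3, 1)
size(m(x))  # (64, 64, 16, 1): reflection padding keeps the spatial size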

Not sure if it’s the most elegant solution, but it works for me and definitely reduces the edge effects in my specific project.

Edit to say: I’ve not tried using this on a GPU, so I’m not sure whether it works there.

Glad you figured it out; I don’t see any reason it wouldn’t work on GPU :slight_smile:. We’ve talked about what the interface for fancier padding in Conv might look like, but never arrived at a design. If you have any ideas there, let us know on GitHub!
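
If you want to sanity-check the GPU path, a quick smoke test would look something like this (untested on my end):

using CUDA, Flux

# Move the custom layer to the GPU; @functor lets `gpu` recurse into its fields.
m = rConv((3, 3), 3 => 16, relu; pad = 1) |> gpu
x = CUDA.rand(Float32, 64, 64, 3, 1)
size(m(x))  # should match the CPU result if pad_reflect handles CuArrays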