Alternatives to zero-padding conv layers?

Is there a way to use same-padding or reflection-padding in Flux convolutional layers, such as the `pad_*` functions NNlib.jl supplies in padding.jl (FluxML/NNlib.jl on GitHub), rather than only zero-padding?

I’m seeing significant edge effects in an image GAN I’m trying to build and wondered if using one of these other padding regimens would help.

Can you file an issue on NNlib so we can track this? In the meantime, applying the `pad_*` functions before a conv layer is not too bad a workaround.
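
For concreteness, here is a minimal sketch of that workaround: reflection-pad the spatial dimensions with `NNlib.pad_reflect` inside a `Chain`, then let the `Conv` layer itself use no padding. The input shape and channel counts are just made up for illustration.

```julia
using Flux, NNlib

# Hypothetical WHCN input: a 32×32 RGB image, batch size 1.
x = rand(Float32, 32, 32, 3, 1)

# Reflection-pad the two spatial dims by 1 pixel on each side, then apply
# a 3×3 convolution with pad=0, so the spatial size stays 32×32 overall.
reflect_conv = Chain(
    x -> NNlib.pad_reflect(x, 1; dims = (1, 2)),
    Conv((3, 3), 3 => 16, relu; pad = 0),
)

y = reflect_conv(x)
@assert size(y) == (32, 32, 16, 1)
```

The same pattern should work with `pad_repeat`, `pad_symmetric`, or `pad_circular` by swapping the function in the anonymous layer.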