I finally got back to looking at this project a couple of weeks ago. This is the best construction I could come up with that lets me use it the same way as a standard convolutional layer:
using Flux, NNlib
using Flux: @functor

# Wraps a Flux.Conv so the input is reflection-padded instead of zero-padded.
struct rConv
    a       # padding amounts, as computed by Flux.calc_padding
    layer   # the wrapped Flux.Conv (built with pad = 0)
end
@functor rConv

function rConv(k::NTuple{N,Integer}, ch::Pair{<:Integer,<:Integer}, σ = identity;
               init = Flux.glorot_uniform, stride = 1, pad = 0, dilation = 1,
               groups = 1, bias = true) where N
    # Work out the padding Flux.Conv would have applied, but keep it to apply ourselves.
    a = Flux.calc_padding(Flux.Conv, pad, k, dilation, stride)
    weight = Flux.convfilter(k, ch; init, groups)
    # The inner Conv gets pad = 0, since the padding happens in the forward pass.
    layer = Flux.Conv(weight, bias, σ; stride, pad = 0, dilation, groups)
    rConv(a, layer)
end

# Forward pass: reflection-pad the input, then apply the convolution.
function (m::rConv)(x)
    m.layer(NNlib.pad_reflect(x, m.a))
end
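For reference, a minimal sketch of how it gets used (the kernel size, channel counts, and input shape here are just illustrative; with a 3×3 kernel and pad = 1 the spatial size is preserved, same as zero padding would):

using Flux

# 3x3 reflection-padded convolution, 1 => 8 channels, relu activation.
m = rConv((3, 3), 1 => 8, relu; pad = 1)

x = rand(Float32, 32, 32, 1, 4)   # WHCN input
@assert size(m(x)) == (32, 32, 8, 4)

# Drops into a Chain like any other layer:
model = Chain(rConv((3, 3), 1 => 8, relu; pad = 1),
              Flux.flatten,
              Dense(32 * 32 * 8, 10))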
I'm not sure it's the most elegant solution, but it works for me and definitely reduces the edge effects in my specific project.
Edit to say: I've not tried using this on a GPU, so I'm not sure whether it will work there.
Glad you figured it out; I don't see any reason it wouldn't work on GPU. We've talked about what the interface for fancier padding in Conv might look like, but never arrived at a design. If you have any ideas there, let us know on GitHub!
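If you want to check quickly, something like this should do it (a sketch assuming CUDA.jl is installed and working; since the struct has @functor, gpu will move the inner Conv's parameters):

using CUDA, Flux

m = rConv((3, 3), 1 => 8; pad = 1) |> gpu
x = CUDA.rand(Float32, 32, 32, 1, 4)
size(m(x))   # expect (32, 32, 8, 4) if NNlib.pad_reflect has a CUDA-compatible path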