Flux: Hidden problem broadcasting convolution layer on GPU?

I found the following comment in the Flux.jl source code for the Conv layer:

function (c::Conv)(x::AbstractArray)
  # TODO: breaks gpu broadcast :(

It’s a bit concerning, given that I am currently working on a project which involves broadcasting a CNN. I could not find a mention of this issue in the documentation.

Does anyone know what is meant by “breaks” (e.g., is it just slower, or are the results incorrect?), and what is meant by “gpu broadcast” (e.g., does it refer to broadcasting a Conv layer over an array of inputs, or something else)?

Thanks in advance!

All that TODO means is that uncommenting the line immediately below it (# ndims(x) == ndims(c.weight)-1 && return squeezebatch(c(reshape(x, size(x)..., 1)))) broke broadcast at some point. I doubt it’s even relevant any more, given that the blame dates from four years ago; the TODO has presumably persisted because nobody has investigated whether that commented-out line is still useful. Either way, it should not affect broadcasting the Conv layer itself.
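For what it’s worth, here is a minimal sketch of what I mean by broadcasting the layer itself (the layer size, channel counts, and input shapes are just made up for illustration):

```julia
using Flux

# A small Conv layer: 3×3 kernel, 1 input channel, 4 output channels.
c = Conv((3, 3), 1 => 4, relu)

# A vector of inputs in Flux's WHCN layout (width, height, channels, batch).
xs = [rand(Float32, 28, 28, 1, 1) for _ in 1:3]

# Broadcasting the layer over the vector applies it to each input;
# the TODO in the source is unrelated to this pattern.
ys = c.(xs)
```

On the GPU the same pattern should apply after moving the layer and inputs over with `gpu` (e.g. `c = gpu(c)` and `xs = gpu.(xs)`), though I have only checked the CPU path here.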