Most deep learning libraries rely on the same small set of primitives: matrix multiplication, element-wise operations, activation functions, and convolutions. Ideally, all of this should run on either GPU (preferably) or CPU. Matrix multiplication and element-wise functions are already pretty well supported on both, via core Julia and GPUArrays. But what about image convolutions?
For GPU, the best option so far seems to be CUDNN.jl. However, it doesn't look very actively used: it depends on CUDArt.jl, which is being "phased out"; Knet.jl calls cuDNN directly; and Flux.jl doesn't mention convolution at all. (There's also MXNet.jl, but I believe it just wraps C code.)
For CPU, the only convolution implementation I've found is the built-in conv2 function, which is not quite the same thing: it computes a mathematical convolution (flipped kernel, "full"-size output), whereas convolutional layers typically need cross-correlation (no kernel flip) with "valid" output, plus support for strides, padding, and batched multi-channel inputs.
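To make the difference concrete, here is a minimal sketch of the "valid" cross-correlation that conv layers compute, written as plain loops (the function name `xcorr2_valid` is just for illustration, not an existing API). Unlike conv2, it doesn't flip the kernel and it returns the smaller "valid"-size output:

```julia
# Naive 2-D "valid" cross-correlation, as used in conv layers.
# Contrast with conv2, which flips the kernel and returns the
# larger "full" output.
function xcorr2_valid(x::AbstractMatrix, w::AbstractMatrix)
    H, W = size(x)
    kH, kW = size(w)
    out = zeros(eltype(x), H - kH + 1, W - kW + 1)
    for j in 1:size(out, 2), i in 1:size(out, 1)
        s = zero(eltype(x))
        # Note: kernel indexed directly, i.e. NOT flipped.
        for v in 1:kW, u in 1:kH
            s += x[i + u - 1, j + v - 1] * w[u, v]
        end
        out[i, j] = s
    end
    return out
end

x = collect(reshape(1.0:16.0, 4, 4))  # 4x4 input
w = ones(2, 2)                         # 2x2 kernel
y = xcorr2_valid(x, w)                 # 3x3 output
```

A real implementation would of course need strides, padding, multiple input/output channels, and a batch dimension, and would probably use im2col + BLAS rather than raw loops, but this is the core operation.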
Is there anybody working on something like this? Are there any discussions I’ve missed?
I currently have some time to work on this, but I guess I'm neither the only nor the first person interested in the topic, so it would be great to synchronize visions before starting.