Reshaping outputs between layers in Flux.jl

The inputs to my network are N x 2 arrays, and I'd like to apply a Dense layer followed by a convolution. This requires reshaping the output of the dense layer. The forward pass seems to work fine, but I think there's an error somewhere in the backprop. Any help is appreciated.

m = Chain(
    Dense(N, N, relu),              # (N, 2) -> (N, 2); the 2 columns are treated as a batch
    x -> reshape(x, N, 1, 2, 1),    # to WHCN layout: (N, 1, 2, 1)
    Conv((2,1), 2=>1, relu),        # (N, 1, 2, 1) -> (N-1, 1, 1, 1)
    x -> x[:],                      # flatten to a length N-1 vector
    Dense(N-1, N, relu)) |> gpu
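
For reference, here is a quick CPU sanity check of the shapes, with a hypothetical N = 10 and a random stand-in for train[1][1]:

using Flux

N = 10  # hypothetical size, for illustration only
m_cpu = Chain(
    Dense(N, N, relu),
    x -> reshape(x, N, 1, 2, 1),
    Conv((2,1), 2=>1, relu),
    x -> x[:],
    Dense(N-1, N, relu))

size(m_cpu(rand(Float32, N, 2)))  # (10,): a single length-N output vector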


m(train[1][1])   # forward pass on a single sample runs fine

loss(x, y) = Flux.mse(m(x), y)
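
The error below shows up when I take gradients of this loss, presumably during a step like this (a hedged sketch, assuming Tracker-era Flux and that train is a vector of (x, y) tuples):

x, y = map(gpu, train[1])   # one (input, target) pair, moved to the GPU
l = loss(x, y)              # forward pass; l is a tracked scalar
Flux.Tracker.back!(l)       # backward pass through the tracked graph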

This is the error; is it some kind of dimension mismatch?

MethodError: no method matching *(::TrackedArray{…,CuArray{Float32,2}}, ::CuArray{Float32,3})
Closest candidates are:
  *(::Any, ::Any, !Matched::Any, !Matched::Any...) at operators.jl:502

It's probably a type mismatch: multiplication isn't currently implemented between a TrackedArray wrapping a CuArray and a plain CuArray, which is basically what the error is telling us.

See the first example at http://fluxml.ai/Flux.jl/stable/training/optimisers.html for how to track a gradient.
Making y a Flux param will probably work.
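
A minimal sketch of that suggestion, assuming the Tracker-era API (param is exported by Flux) and that train[1] is an (x, y) pair:

x, y = map(gpu, train[1])   # hypothetical single batch on the GPU
y = param(y)                # track the target, per the suggestion above

l = loss(x, y)
Flux.Tracker.back!(l)       # backprop should no longer mix tracked and plain CuArrays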