Flux: Feed minibatch into Neural Network

I built the following model:

function my_model(n)
    conv_1 = Conv((8,8), 3 => 32, relu; stride=(4,4))
    conv_2 = Conv((4,4), 32 => 64, relu; stride=(2,2))
    conv_3 = Conv((3,3), 64 => 64, relu; stride=(1,1))
    model = Chain(
        x -> x ./ 255,
        conv_1,
        conv_2,
        conv_3,
        x -> reshape(x, (:, 1)),
        Dense(2304, 512, relu),
        Dense(512, n),
    )
    return model
end

and tested it in the following way, where the data is stored in (width, height, # channels, # batches) order:

model = my_model(6)
test_input = rand(UInt8, (80, 80, 3, 1))
test_batch = rand(UInt8, (80, 80, 3, 32))

model(test_input)  # works
model(test_batch)  # throws DimensionMismatch

When testing the model with test_batch I get a dimension mismatch error: DimensionMismatch("A has dimensions (512,2304) but B has dimensions (73728,1)"). It seems the reshape command does not work for the batch in this way. I thought I could feed my network a single input as well as a batch when storing the data in WHCN order. Could anybody help me with this?

Dense layers in Flux expect that batches will be ordered as columns, but your model reshapes every tensor to an n×1 matrix. The dense layer after your call to reshape is expecting a 2304×m matrix, though.
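To see the column-as-batch convention concretely, here is a minimal sketch (the sizes match the question's flattened feature dimension):

```julia
using Flux

d = Dense(2304, 512, relu)

x = rand(Float32, 2304, 32)  # 32 samples, one per column
size(d(x))                   # (512, 32) -- one output column per sample
```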

You luck out with the dimensions matching for test_input, where you’re giving a 2304×1 matrix to your dense layer. However, the dimensions don’t match with test_batch, where you’re giving a 73728×1 matrix to your dense layer. If you instead use x -> reshape(x, (:, size(x, 4))), you’ll get a matrix with one batch element per column. Your test_batch variable, for example, will get reshaped to a 2304×32 matrix, which the dense layer can accept.
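Putting it together, a sketch of the corrected model (same layers as in the question, only the reshape changed; the 2304 assumes the 80×80×3 inputs from the question):

```julia
using Flux

function my_model(n)
    Chain(
        x -> x ./ 255,
        Conv((8,8), 3 => 32, relu; stride=(4,4)),   # 80×80 -> 19×19
        Conv((4,4), 32 => 64, relu; stride=(2,2)),  # 19×19 -> 8×8
        Conv((3,3), 64 => 64, relu; stride=(1,1)),  # 8×8  -> 6×6
        x -> reshape(x, :, size(x, 4)),             # 6×6×64×N -> 2304×N
        Dense(2304, 512, relu),
        Dense(512, n),
    )
end

model = my_model(6)
batch = rand(Float32, 80, 80, 3, 32)
size(model(batch))  # (6, 32)
```

Flux also ships a `Flux.flatten` helper that performs this same reshape, so you can use it in place of the anonymous function.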