Why the reshape in the Flux MNIST convolution example

question

#1

I have learned to use most of the ML packages, and I can compute
the output shape of a convolution given the kernel, stride, pad…, but I don't understand why or how to do a reshape when using Flux with convolutions (and probably other layers), so I need to understand this.
Please help.
Thank you,
H.Kramer


#2

Thanks. Working on this problem myself helped me gain a lot of insight into
how Flux and Julia work. The reshape is a flatten step: it takes the 6×6×8×N convolutional output and turns it into a 288×N matrix for the Dense layer, or what many call a fully connected layer.