# Why the reshape in Flux mnist convolution example

**hskramer**#1

I have learned to use most of the ML packages, and I can compute the outcome of convolutions given the kernel, stride, pad, etc., but I don't understand why or how to do a reshape when using Flux with convolutions (and probably other layers), so I need to understand this.

Please help.

Thank you

H.Kramer

**hskramer**#2

Thanks! Working on this problem myself helped me gain a lot of insight into how Flux and Julia work. The reshape acts as a flatten layer: it takes the 6×6×8×N output of the conv stack and turns it into the 288×N matrix that the Dense layer (what many call a fully connected layer) expects.
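To make the shape arithmetic concrete, here is a minimal sketch with plain Julia arrays (no Flux needed): the batch size of 5 is just an example stand-in for N, and the anonymous-function form `x -> reshape(x, :, size(x, 4))` is how the flatten step typically appears inside a `Chain` in the model-zoo example.

```julia
# The conv stack ends with a 6×6×8×N array (width × height × channels × batch).
# The reshape collapses the first three dimensions into one column per sample,
# producing the 288×N matrix the Dense layer expects.

x = rand(Float32, 6, 6, 8, 5)   # pretend conv output: a batch of 5 feature maps

# `:` tells reshape to infer the remaining length, i.e. 6 * 6 * 8 = 288.
# This is exactly what the `x -> reshape(x, :, size(x, 4))` layer does.
y = reshape(x, :, size(x, 4))

size(y)  # (288, 5)
```

Because Julia arrays are column-major, each column of `y` is the same data as `vec(x[:, :, :, i])` for sample `i`, so no values move in memory; only the shape metadata changes.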