You can define an equivalent MLJFlux chain builder (that works with images of any size) with:
```julia
import MLJFlux
using Flux

builder = MLJFlux.@builder Chain(
    Flux.flatten,                                 # flatten each image into a vector
    Dense(prod(n_in) * n_channels => 128, relu),  # n_in, n_channels supplied by @builder
    Dropout(0.2),
    Dense(128 => 10),                             # 10 output classes (MNIST digits)
)
```
To adapt this notebook to MNIST images, just replace the builder with the one above.
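For orientation, here is a minimal sketch of how such a builder plugs into an ImageClassifier. The variable names `images` and `labels` and the hyperparameter values are illustrative only; it assumes `images` is a vector of images with the `GrayImage` scitype and `labels` is a `Multiclass` vector.

```julia
using MLJ
import MLJFlux

# Illustrative values only; `images`/`labels` are assumed to be MNIST data
# already coerced to the GrayImage / Multiclass scitypes.
clf = MLJFlux.ImageClassifier(builder=builder,
                              epochs=5,
                              batch_size=50)
mach = machine(clf, images, labels)
fit!(mach)
predict(mach, images[1:3])   # probabilistic predictions (softmax applied)
```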
Note that ImageClassifier automatically appends a softmax to the chain, so that predictions are probabilities. If you don’t want that final layer, set `finaliser=identity` in the ImageClassifier constructor.
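For example, a sketch of a model returning raw scores instead of probabilities (assuming the builder defined above; if you drop the softmax, you would typically also switch to a logit-based loss such as `Flux.logitcrossentropy`):

```julia
# Sketch only: no softmax finaliser, so use a loss that expects raw logits.
clf_raw = MLJFlux.ImageClassifier(builder=builder,
                                  finaliser=identity,
                                  loss=Flux.logitcrossentropy)
```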