Flux: Obtaining individual layer output for DenseNet from Metalhead.jl


I’m training the DenseNet from Metalhead.jl on the CIFAR-10 dataset, which generally seems to work quite well. However, I can’t figure out how to access the output of individual layers in the Chain.

I know about Flux.activations(), but it just gives me an array whose first element has size `size(Flux.activations(chain, input)[1]) = (1, 1, 1024, 1)`, which I’m not sure how to interpret. Is there any other way to save the output of an individual layer during a forward pass?

The DenseNet in Metalhead.jl is a large Chain wrapping two smaller Chains: a backbone and a classifier head. In your example, Flux.activations acts on that outer Chain, so you only see the outputs of those two sub-Chains rather than of every layer. What you’re probably looking for is something like `Flux.activations(Metalhead.backbone(model), x)`, which should give you a Tuple with the output of each layer in the backbone.
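A minimal sketch of what that looks like end to end (the model size and the dummy input are just illustrative choices, not from the original question):

```julia
using Flux, Metalhead

# An untrained DenseNet-121; any DenseNet variant from Metalhead works the same way.
model = DenseNet(121)

# Dummy CIFAR-10-sized batch in Flux's WHCN layout: 32×32 RGB image, batch size 1.
x = rand(Float32, 32, 32, 3, 1)

# Calling Flux.activations on the backbone Chain (instead of the whole model)
# returns a Tuple with one entry per layer of the backbone.
acts = Flux.activations(Metalhead.backbone(model), x)

length(acts)     # number of layers in the backbone
size(acts[end])  # the final feature map that feeds the classifier head
```

From there you can index `acts` to save whichever intermediate layer output you need.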

That seems to be exactly what I was looking for :slight_smile: Thanks!
