Freeze a weight with Flux

I have a few linear layers created with Flux. I ran these layers for a few epochs, and would now like to freeze all the weights of one of the linear layers while leaving its bias unfrozen. Could somebody post a MWE demonstrating the freezing and unfreezing of the weights of a given layer? Thanks.

You can use Optimisers.freeze!, see the docs:
https://fluxml.ai/Optimisers.jl/dev/#Frozen-Parameters
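Roughly, something like this should work. A minimal sketch assuming a recent Flux with the explicit Optimisers.setup/update! API; the layer sizes, loss, and the Adam rule are just placeholders. Since the optimiser state tree mirrors the model's structure, you can reach into a single leaf (here the first layer's weight) and freeze only that:

```julia
using Flux, Optimisers

# two illustrative Dense (linear) layers
model = Chain(Dense(4 => 8, relu), Dense(8 => 2))

# explicit optimiser state, structured like the model
opt_state = Optimisers.setup(Optimisers.Adam(1e-3), model)

# freeze only the weight of the first layer; its bias stays trainable
Optimisers.freeze!(opt_state.layers[1].weight)

# train as usual; update! skips frozen leaves
x, y = randn(Float32, 4, 16), randn(Float32, 2, 16)
grads = Flux.gradient(m -> Flux.mse(m(x), y), model)
opt_state, model = Optimisers.update!(opt_state, model, grads[1])

# later, make the weight trainable again
Optimisers.thaw!(opt_state.layers[1].weight)
```

Calling Optimisers.freeze!(opt_state.layers[1]) instead would freeze the whole layer (weight and bias), and Optimisers.thaw!(opt_state) unfreezes everything.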