How to freeze a single weight of a layer?

The code above still removes an entire parameter array from training. The easiest way to freeze a single connection/weight is to zero (mask out) the corresponding entry in the gradient before the optimiser step:

...

for epoch in 1:epochs
    for (x, y) in data
        # implicit-parameter gradient over everything in p (Flux.params(model))
        gr = gradient(p) do
            loss(x, y)
        end

        # zero the gradient of the 2nd entry (linear index) of the first
        # layer's weight matrix, so update! leaves that weight unchanged
        gr[model[1].W][2] = 0
        Flux.update!(opt, p, gr)
    end
end
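
If you need to freeze several individual weights, the same idea generalises to a boolean mask multiplied into the gradient each step, instead of hard-coding indices. Below is a minimal, self-contained sketch under the same implicit-parameter API as above; the model, optimiser, loss, and the names `mask` and `W` are illustrative, and note that newer Flux versions rename the Dense field `W` to `weight`.

using Flux

model = Chain(Dense(4, 2), Dense(2, 1))
p = Flux.params(model)
opt = Descent(0.1)
loss(x, y) = sum(abs2, model(x) .- y)
data = [(rand(Float32, 4, 8), rand(Float32, 1, 8))]

W = model[1].W              # first layer's weight matrix (`weight` in newer Flux)
mask = trues(size(W))       # true = trainable
mask[2] = false             # freeze the 2nd weight (linear index)

for (x, y) in data
    gr = gradient(() -> loss(x, y), p)
    gr[W] .*= mask          # zero out gradients of all frozen weights at once
    Flux.update!(opt, p, gr)
end

Keeping the mask as data also makes it easy to change which weights are frozen between epochs without touching the training loop.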