Gradient of parameters returns `nothing`

Hi all! I am trying to construct a loss function in Flux (v0.11.2) that regularizes the parameters relative to some number (not 0). Below is a minimal working example; does anybody know why the gradients are `nothing`?

using Flux

m = Chain(Dense(2, 10, relu), Dense(10, 10, relu), Dense(10, 1))

ps = Flux.params(m)

function project_norm() # basically trying to do sum(||x - p||^2 over all p in params) for a specific x (here x = 1)
    t = 0.0f0

    for k in 1:length(ps)
        for j in 1:length(ps[k])
            t += (1.0f0 - ps[k][j])^2
        end
    end

    return t
end

gs = gradient(ps) do
    project_norm()
end

gs[ps[1]] # returns nothing?
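For what it's worth, the regularisation example in the Flux docs writes this kind of penalty by mapping a function over the parameter arrays rather than scalar-indexing into the `Params` object. Here is my penalty rewritten in that style, as a sketch; I don't know whether indexing into `ps` is what makes Zygote lose track of the gradients, so I'd be curious whether this variant is the intended pattern:

```julia
# Same penalty as project_norm(), but broadcasting over each
# parameter array instead of indexing into the Params object.
sq_dist(p) = sum((1.0f0 .- p) .^ 2)
project_norm2() = sum(sq_dist, ps)

gs2 = gradient(ps) do
    project_norm2()
end

gs2[ps[1]] # is this expected to be non-nothing?
```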