Impose initialization and normalization on layers in Flux

Hi all, I am constructing a simple perceptron with Flux, something like: y = W*x + b. To do so I simply use:

model = Dense(N,1,nonlinearity);

What I do not understand is:

  • How can I impose the initial values of the matrix W and the bias b?
  • How can I impose that norm(W) = 1 at every iteration of training?

Thanks!

Hi

To set initial values you can do

W = rand(1, N); b = rand(1);                      # your chosen initial values
model = Dense(param(W), param(b), nonlinearity);  # param marks them as trainable (older, Tracker-based Flux)

This is also roughly what happens when you call model = Dense(N,1,nonlinearity); you can see this by looking at the source with

@edit model = Dense(N,1,nonlinearity);
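
If you would rather let Dense allocate W and b itself and just control how they are filled, the constructor also takes initializer keyword arguments. This is only a minimal sketch: the keyword names are version-dependent (initW/initb in older Flux, init/bias in newer releases), and my_init is a hypothetical initializer, so check the docs for your Flux version.

my_init(dims...) = 0.01 .* randn(dims...)                          # hypothetical: small random initial weights
model = Dense(N, 1, nonlinearity; initW = my_init, initb = zeros)  # keyword names depend on the Flux version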

One option to keep norm(W) = 1 would be to define a custom wrapper, e.g.

using LinearAlgebra: norm

struct NormedDense{D}
  l::D            # the wrapped Dense layer
end
function (a::NormedDense)(x::AbstractArray)
  # rescale the weight matrix to unit norm on every forward pass
  W, b, σ = a.l.W ./ norm(a.l.W), a.l.b, a.l.σ
  σ.(W*x .+ b)
end
model = NormedDense(Dense(param(W), param(b), nonlinearity));
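
A quick usage check that the wrapper behaves like an ordinary layer (a minimal sketch; N, W, b and nonlinearity are the names from the snippets above):

x = rand(N)      # one input sample of length N
y = model(x)     # forward pass; W is rescaled to unit norm inside the call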

Using Julia v1.4 and a recent Flux, this works directly:

model = Dense(W,b)
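
Another way to keep norm(W) = 1 at every iteration is to project the weights back onto the unit-norm sphere after each optimizer step. A minimal sketch, assuming a recent Flux where a Dense layer stores its parameters in the fields weight and bias (field names differ between Flux versions); xs, ys and the squared-error loss are placeholders for your own data and objective:

using Flux, LinearAlgebra

model = Dense(rand(1, N), rand(1))           # explicit initial W and b, as in the question
opt = Descent(0.01)
ps = Flux.params(model)

loss(x, y) = sum(abs2, model(x) .- y)        # simple squared-error loss

for (x, y) in zip(xs, ys)                    # xs, ys: your training data
  gs = gradient(() -> loss(x, y), ps)
  Flux.Optimise.update!(opt, ps, gs)
  model.weight ./= norm(model.weight)        # re-project W so that norm(W) = 1
end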