Lux initialization

Hello everyone, apologies for the probably naive question: if I want to initialize the layer parameters with glorot_uniform in Lux using a gain smaller than the default, how do I then obtain the parameters and state to define the optimization problem?

I’d like to do something like this:

approximating_neural_network = Lux.Chain(
    Lux.Dense(4, 16, tanh; init_weight=Lux.glorot_uniform(rng, 4, 16, gain=0.01), init_bias=Lux.zeros32),
    Lux.Dense(16, 16, tanh; init_weight=Lux.glorot_uniform(rng, 16, 16, gain=0.01), init_bias=Lux.zeros32),
    Lux.Dense(16, 1; init_weight=Lux.glorot_uniform(rng, 16, 1, gain=0.01), init_bias=Lux.zeros32),
)

# p_net, st = Lux.setup(rng, approximating_neural_network)

p_net, st?

You want to pass a function as the init_weight parameter; it will then be called as weight = init_weight(rng, out_dims, in_dims) during setup.

This can easily be done by defining your own function that calls glorot_uniform with the desired gain:

my_glorot_uniform(rng, dims...) = Lux.glorot_uniform(rng, dims...; gain=0.01)

approximating_neural_network = Lux.Chain(
    Lux.Dense(4, 16, tanh; init_weight=my_glorot_uniform, init_bias=Lux.zeros32),
    Lux.Dense(16, 16, tanh; init_weight=my_glorot_uniform, init_bias=Lux.zeros32),
    Lux.Dense(16, 1; init_weight=my_glorot_uniform, init_bias=Lux.zeros32),
)
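As an aside: depending on your Lux/WeightInitializers version, initializers may also support partial application, i.e. calling them with only keyword arguments returns a new initializer function you can pass directly. This is a sketch to the same effect as the wrapper above; check your installed version before relying on it:

```julia
# Partially applied initializer (assuming your WeightInitializers version
# supports keyword-only calls returning a closure):
init_w = Lux.glorot_uniform(gain=0.01)

approximating_neural_network = Lux.Chain(
    Lux.Dense(4, 16, tanh; init_weight=init_w, init_bias=Lux.zeros32),
    Lux.Dense(16, 16, tanh; init_weight=init_w, init_bias=Lux.zeros32),
    Lux.Dense(16, 1; init_weight=init_w, init_bias=Lux.zeros32),
)
```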

p_net, st = Lux.setup(rng, approximating_neural_network)
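From there, one common way to turn p_net and st into an optimization problem is to flatten the parameters with ComponentArrays.jl and hand them to Optimization.jl. A sketch, assuming ComponentArrays, Optimization, OptimizationOptimisers, and Zygote are installed; the data x, y and the squared-error loss are placeholders for your actual problem:

```julia
using Lux, Random, ComponentArrays, Optimization, OptimizationOptimisers

rng = Random.default_rng()
p_net, st = Lux.setup(rng, approximating_neural_network)

# Flat, AD-friendly view of the nested parameter NamedTuple
p_vec = ComponentArray(p_net)

# Placeholder data: 4 inputs, 1 output, 32 samples
x = rand(rng, Float32, 4, 32)
y = rand(rng, Float32, 1, 32)

function loss(p, _)
    ŷ, _ = approximating_neural_network(x, p, st)  # st is closed over, unchanged
    return sum(abs2, ŷ .- y)
end

optf = OptimizationFunction(loss, Optimization.AutoZygote())
prob = OptimizationProblem(optf, p_vec)
sol = solve(prob, OptimizationOptimisers.Adam(0.01); maxiters=100)
```

The state st is not optimized; it is captured in the loss closure and stays fixed (which is fine for a stateless Dense chain like this one).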

Thank you!