Flux.BatchNorm doesn't accept Int32 as argument

Using Flux 0.12.7, I am trying to use the BatchNorm layer via the following statement:

Flux.BatchNorm(n_layers)
# I know that n_layers will have type Int32

However, I get the following error:

ERROR: LoadError: MethodError: no method matching BatchNorm(::Int32)
Closest candidates are:
  BatchNorm(::F, ::V, ::V, ::W, ::W, ::N, ::N, ::Bool, ::Bool, ::Union{Nothing, Bool}, ::Int64) where {F, V, N, W} at /home/dknite/.julia/packages/Flux/ZnXxS/src/layers/normalise.jl:235
  BatchNorm(::Any, ::Any, ::Any, ::Any, ::Any, ::Any, ::Any) at deprecated.jl:70
  BatchNorm(::Any, ::Any, ::Any, ::Any, ::Any, ::Any, ::Any, ::Any) at deprecated.jl:70

How do I fix this?

n_layers seems a misleading argument name; it should be n_channels. Besides that, can't you simply pass an Int, e.g. Flux.BatchNorm(3)? Or convert the value you already have: Flux.BatchNorm(Int(n_layers)).
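
A minimal sketch of the conversion, assuming n_layers arrives as an Int32 from somewhere else (the variable names, the value 3, and the input shape are just illustrative):

using Flux

n_layers = Int32(3)                  # suppose this value comes in as an Int32
bn = Flux.BatchNorm(Int(n_layers))   # convert to a plain Int before constructing the layer

x = rand(Float32, 3, 16)             # 3-channel input, batch of 16
y = bn(x)                            # the layer constructs and applies as usual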
