Flux.@functor won't work on custom layer (empty parameters)

I’m having a hard time understanding the behavior of the @functor macro. See the snippets below: for the first one, I follow the tutorial on custom layers step by step and it works. For the second one, I alter the code slightly for my own custom layer, and @functor is no longer able to peek into the struct, leading to empty params.

Can anyone share any insights on why the two codes work differently? Thanks in advance.
The tutorial:

using Flux

#Custom Layer from the Tutorial
struct Affine
  W
  b
end

# Some constructor
Affine(in::Integer, out::Integer) = Affine(randn(out, in), randn(out))
# Overload call, so the object can be used as a function
(m::Affine)(x) = m.W * x .+ m.b
a = Affine(10, 5)
a(rand(10)) # Works: 5-element vector

Flux.@functor Affine 
Flux.params(a) #Works

My code:

#My Custom Layer
struct MyLayer
  a
  b
  c
end

# Some constructor
MyLayer(a::Real, b::Real) = MyLayer(a,b,-b)  
# Overload call, so the object can be used as a function
(m::MyLayer)(x) = [x[1],m.a*x[2] + m.b*x[1]]
b = MyLayer(1.,2.)
b(rand(2)) # Works: 2-element vector

Flux.@functor MyLayer 
Flux.params(b) # Doesn't work: Params([])

Got these results in a fresh VSCode session, using Flux v0.13.13.

@functor won’t extract fields that are scalar. You need the parameters to be arrays.
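One workaround, a sketch not from the original thread (the struct name `MyArrayLayer` is made up here), is to store each scalar as a 1-element array so Flux.params can collect it:

```julia
using Flux

# Same layer as MyLayer above, but parameters held in 1-element
# arrays so that Flux.params sees them as trainable.
struct MyArrayLayer
    a
    b
end
MyArrayLayer(a::Real, b::Real) = MyArrayLayer([a], [b])

# Index into the wrapper arrays when applying the layer
(m::MyArrayLayer)(x) = [x[1], m.a[1]*x[2] + m.b[1]*x[1]]

Flux.@functor MyArrayLayer

b = MyArrayLayer(1., 2.)
Flux.params(b)  # no longer empty: contains the two 1-element arrays
```

The forward pass is unchanged apart from the `[1]` indexing into the wrapper arrays.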


Thank you, that’s exactly it.

Functors.jl will handle scalar fields, but Optimisers.jl won’t. We tried making it do so a while back, but it turned out that people store plenty of scalars in their models that they don’t want to be trainable. We might change this in a future (very breaking) version, but for now Optimizing scalars · Issue #92 · FluxML/Optimisers.jl · GitHub has further background info.
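To illustrate the distinction, here is a minimal sketch using only Functors.jl (the `Scale` struct is invented for this example):

```julia
using Functors

struct Scale
    s  # a scalar field
end
Functors.@functor Scale

# fmap from Functors.jl does traverse down to the scalar leaf
# and transforms it...
doubled = fmap(x -> 2x, Scale(3.0))
doubled.s  # 6.0

# ...even though Flux.params / Optimisers.jl would not collect
# that same scalar as a trainable parameter.
```

So the struct is fully functor-able; it is only the optimization machinery that ignores scalar leaves.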