How do I implement the Parametric ReLU (PReLU) function in Flux v0.11.1?

Sorry, @RiN, I’m afraid I completely misunderstood at first and led you the wrong way. If you want α to be a learnable parameter, you can’t just use Dense as I suggested; your initial approach is correct.

The training problem seems to be that Flux.params collects only mutable (array) fields, so a plain scalar field like α is silently skipped. See:

julia> struct PReLU_Dense
           W; b; α
       end

julia> Flux.@functor PReLU_Dense

julia> Flux.trainable(PReLU_Dense(1,2,3))
(W = 1, b = 2, α = 3)

julia> Flux.params(PReLU_Dense(1,2,3))
Params([])

When the fields are arrays instead:

julia> Flux.params(PReLU_Dense([1],[2],[3]))
Params([[1], [2], [3]])

Therefore I think the quickest fix in your case is to make α a length-1 vector, so that its value is mutable, rather than making the entire layer a mutable struct to accommodate it. You will also need to modify the layer application function accordingly, e.g.:

function (m::PReLU_Dense)(x)
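    # m.α is a length-1 vector, so index it to pass the scalar slope to prelu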
    prelu.(m.W * x .+ m.b, m.α[1])
end
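
For completeness, here is a self-contained sketch of the whole layer along these lines, written against Flux v0.11.1. The prelu definition and the convenience constructor (including the init_α keyword and its 0.25 default) are my assumptions, since your original code isn’t shown; substitute your own versions where they differ:

using Flux

# Assumed elementwise activation; replace with your original definition if it differs.
prelu(x, α) = ifelse(x > 0, x, α * x)

struct PReLU_Dense{M,V,A}
    W::M
    b::V
    α::A  # length-1 vector so Flux.params can collect and mutate it
end

# Hypothetical convenience constructor: Glorot weights, zero bias, α initialised to 0.25.
PReLU_Dense(in::Integer, out::Integer; init_α = 0.25f0) =
    PReLU_Dense(Flux.glorot_uniform(out, in), zeros(Float32, out), [init_α])

Flux.@functor PReLU_Dense

(m::PReLU_Dense)(x) = prelu.(m.W * x .+ m.b, m.α[1])

And a quick check that α is now picked up and receives a gradient:

m = PReLU_Dense(3, 2)
Flux.params(m)                # now lists W, b, and α
x = rand(Float32, 3, 5)
gs = Flux.gradient(() -> sum(m(x)), Flux.params(m))
gs[m.α]                       # a length-1 gradient, so α will be updated in training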