I want to implement the Parametric ReLU (PReLU) activation in Flux v0.11.1.
I tried to implement it based on the following post, but the version of Flux used there is old and much of its API is no longer available in v0.11.1.
The following code was written based on the official Flux documentation and forum posts.
```julia
using Flux

# Dense layer with a PReLU activation; α is the (trainable) slope for negative inputs
struct PReLU_Dense
    W
    b
    α
end

Flux.trainable(a::PReLU_Dense) = (a.W, a.b, a.α)

PReLU_Dense(in::Integer, out::Integer, α) = PReLU_Dense(randn(out, in), randn(out), α)

prelu(x, α) = x > 0 ? x : α * x

function (m::PReLU_Dense)(x)
    prelu.(m.W * x .+ m.b, m.α)
end

Flux.@functor PReLU_Dense

m = Chain(PReLU_Dense(2, 4, 0.1), Dense(4, 1))
```
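As a sanity check, the elementwise activation itself does behave as expected independently of Flux (a minimal sketch, using only the `prelu` definition from the code above):

```julia
# identity for positive inputs, scaled by α otherwise
prelu(x, α) = x > 0 ? x : α * x

prelu(2.0, 0.1)                 # → 2.0
prelu(-2.0, 0.1)                # → -0.2
prelu.([-1.0, 0.0, 3.0], 0.1)   # broadcast over a vector → [-0.1, 0.0, 3.0]
```

So the forward pass of the activation seems fine; the problem appears to be somewhere in how the layer trains.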
This code runs, but the network does not learn well, so I suspect the implementation is incomplete.
My Julia skills are still in their infancy, and I am not sure how to fix this.
Any ideas would be appreciated.