I am trying to define a custom type-stable layer that lets me define activations as `Activation(f::Function)`. To check that the layer works with Flux, I run a `test_training(model, x, y)` function, which trains a model containing my layer for two iterations and checks whether its parameters actually update. The model trained fine on my machine, but failed to do so on GitHub Actions, hitting the `error("Parameters not updating.")` branch in my testing function.
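For concreteness, here is a minimal sketch of the kind of type-stable layer I mean (simplified; the parametric field type is what makes it type-stable):

```julia
using Flux

# Type-stable: the wrapped function's concrete type is a type
# parameter, so the compiler can specialize on calls to it.
struct Activation{F}
    f::F
end
(a::Activation)(x) = a.f.(x)  # apply the activation elementwise
```

And roughly what the test does (the loss, learning rate, and data below are placeholders, not my exact test):

```julia
# Train for two iterations and verify that at least one parameter moved.
function test_training(model, x, y)
    ps = Flux.params(model)
    before = deepcopy(collect(ps))   # snapshot of parameters before training
    opt = Descent(0.1)
    loss() = Flux.mse(model(x), y)
    for _ in 1:2
        gs = Flux.gradient(loss, ps)
        Flux.Optimise.update!(opt, ps, gs)
    end
    all(p == q for (p, q) in zip(collect(ps), before)) &&
        error("Parameters not updating.")
    return true
end

# Example invocation with placeholder data:
# test_training(Chain(Dense(2, 2), Activation(relu)),
#               rand(Float32, 2, 8), rand(Float32, 2, 8))
```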
I am now trying to understand why this happens. After some tinkering I found that a type-unstable variant of the `Activation` struct does not trigger the error. Changing `opt` from `Descent` to `ADAM` also allows the model to train.
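By the type-unstable variant I mean storing the function behind the abstract `Function` field type instead of a type parameter, roughly:

```julia
# Type-unstable variant: `f` has the abstract field type `Function`,
# so calls through it go via dynamic dispatch. This version does
# NOT trigger the error on CI.
struct Activation
    f::Function
end
(a::Activation)(x) = a.f.(x)
```

Likewise, swapping the optimizer in the test, e.g. `opt = ADAM()` instead of `opt = Descent(0.1)` (the 0.1 learning rate is again a placeholder), makes the test pass with both versions of the struct.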