Hello,

I am new to Flux and I would appreciate help with implementing a custom layer (or maybe a custom activation?).

The layer is like a Dense layer: some number of inputs, and exactly two output neurons — the first with NNlib.softplus activation, the second with NNlib.sigmoid activation.

The motivation is to model the parameters of a negative binomial distribution (n, p) directly.

The loss function for this is defined as described here:

Is there any chance someone experienced enough can help me with this?

a] How can I specify that each neuron in the output layer has a different activation?
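Something like this is what I have in mind — just a sketch, `NBHead` is my own name and I am not sure it is idiomatic Flux:

```julia
using Flux, NNlib

# Sketch: a Dense layer with two outputs, then a different activation
# applied to each output row. `NBHead` is my own naming, not from Flux.
struct NBHead
    dense::Dense
end
Flux.@functor NBHead   # newer Flux versions use Flux.@layer instead

NBHead(in::Int) = NBHead(Dense(in => 2))

function (m::NBHead)(x)
    y = m.dense(x)                  # 2 × batch
    n = NNlib.softplus.(y[1:1, :])  # first neuron: softplus, so n > 0
    p = NNlib.sigmoid.(y[2:2, :])   # second neuron: sigmoid, so 0 < p < 1
    vcat(n, p)
end
```

Is splitting the rows like this the right approach, or is there a built-in way?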

b] How can I construct the loss function based on the likelihood estimate as in the article?
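For the loss, I was thinking of a negative log-likelihood along these lines — a sketch only; `nb_nll` is my own name, and I am assuming the parameterization P(Y=y) = Γ(y+n)/(Γ(n)·y!) · p^n · (1-p)^y:

```julia
using SpecialFunctions: loggamma

# Sketch of a mean negative log-likelihood for NegativeBinomial(n, p),
# assuming the pmf  P(Y=y) = Γ(y+n)/(Γ(n) y!) * p^n * (1-p)^y.
# nhat, phat would be the two output rows of the network; y the observed counts.
function nb_nll(nhat, phat, y)
    ll = loggamma.(y .+ nhat) .- loggamma.(nhat) .- loggamma.(y .+ 1) .+
         nhat .* log.(phat) .+ y .* log.(1 .- phat)
    -sum(ll) / length(y)
end
```

Would Zygote differentiate through `loggamma` here, or do I need something else?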

The target is a discrete variable with a negative binomial distribution.

Thanks to anyone helping me out

Lubo