I’ve written a neural network using the Knet library, and I’m trying to use the normalized output of the network as a probability density. The problem is that when I use ReLU as the activation function and normalize the output column-wise, Julia throws an “isprobvec(p) is not satisfied” error. ReLU only returns zero or positive numbers, so negative values can’t be the problem. I can’t figure out what is causing the error.
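Here is a minimal sketch of my pipeline (with `relu` defined inline instead of imported from Knet, and random scores standing in for my actual network's output, since the full model is too large to post):

```julia
using Distributions  # provides Categorical, which checks isprobvec internally

relu(x) = max(zero(x), x)     # stand-in for Knet's relu

scores = randn(4, 8)          # stand-in for the network's raw output (classes × batch)
a = relu.(scores)             # nonnegative activations
p = a ./ sum(a; dims=1)       # column-wise normalization

# Building a distribution from each column is where the error shows up for me
# (it happens intermittently, not on every run):
try
    dists = [Categorical(p[:, j]) for j in 1:size(p, 2)]
catch err
    println(err)
end
```

With my real network the error appears during training, but the normalization step is the same as above.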