I’ve written a neural network using the Knet library and I’m trying to use the normalized output of the network as a density function. The problem is that when I use ReLU as the activation function and apply column-wise normalization to the output, Julia throws an “isprobvec(p) is not satisfied” error. Since ReLU always returns zero or a positive number, negative values shouldn’t be the problem. I can’t understand what is causing the error.
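For reference, here is a minimal sketch of the setup I mean, in plain Julia rather than my actual Knet model; the `relu` helper, the matrix shapes, and the random input are just placeholders:

```julia
using Distributions  # isprobvec is defined in Distributions.jl

relu(x) = max.(0, x)

# Stand-in for the network output: one column per sample.
scores = relu(randn(4, 3))

# Column-wise normalization so each column sums to 1.
probs = scores ./ sum(scores; dims = 1)

# This is the check that fails for me. Note that if every entry of a
# column is zero, its sum is zero and the division above produces NaNs,
# which would also fail isprobvec -- but I haven't confirmed that's the cause.
for j in 1:size(probs, 2)
    println(Distributions.isprobvec(probs[:, j]))
end
```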