I’ve written a neural network using the Knet library, and I’m trying to use the normalized output of the network as a density function. The problem is that when I use ReLU as the activation function and apply column-wise normalization to the output, Julia throws an “isprobvec(p) is not satisfied” error. ReLU always returns zero or a positive number, so negative values shouldn’t be the issue. I can’t figure out what is causing the error.
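Roughly, the setup looks like this (a simplified stand-in for my real model: `scores` is just random data in place of the actual network output, and I normalize each column by dividing it by its sum):

```julia
using Knet, Distributions

# Stand-in for the raw network output: 4 classes × 8 samples,
# with some entries negative before the activation.
scores = randn(4, 8)

# Knet's relu is max(0, x), so every entry of `a` is >= 0.
a = relu.(scores)

# Column-wise normalization: each column should sum to 1.
p = a ./ sum(a; dims=1)

# Using one normalized column as a categorical density is where
# the "isprobvec(p) is not satisfied" error shows up.
d = Categorical(p[:, 1])
```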