I’ve written a neural network using the Knet library, and I’m trying to use its column-normalized output as a density function. The problem is that when I use ReLU as the activation function and apply column-wise normalization to the output, Julia throws an “isprobvec(p) is not satisfied” error. ReLU always returns zero or a positive number, so negative entries can’t be the cause. I can’t understand what is causing the error.
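
Here is a minimal sketch of the kind of thing I’m doing (the shapes and weights below are just placeholders, not my actual model):

```julia
using Knet, Distributions

w = randn(Float32, 3, 10)        # placeholder weights: 3 output classes, 10 inputs
x = randn(Float32, 10, 5)        # a batch of 5 samples as columns
y = relu.(w * x)                 # ReLU output, so every entry is >= 0

p = y ./ sum(y, dims=1)          # column-wise normalization
Categorical(Float64.(p[:, 1]))   # this is where "isprobvec(p) is not satisfied" shows up
```

Since every column of `p` should sum to 1 by construction, I don’t see why `isprobvec` would reject it.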