I’ve written a neural network using the Knet library and I am trying to use its normalized output as a density function. The problem is that when I use ReLU as the activation function and apply column-wise normalization to the output, Julia throws an “isprobvec(p) is not satisfied” error. Since ReLU always returns zero or a positive number, negative entries shouldn’t be the problem. I cannot figure out what is causing the error.