How to calculate cross-entropy loss between y_predict and y_target on the GPU

I have not been able to find out how to calculate the cross-entropy loss between each column of a y_predict matrix and a y_target matrix as a GPU operation. Please let me know how I can do this.

Thank you,

Jack

Calculating cross-entropy on 2D (and higher-dimensional) arrays is covered in the docs. There is no difference between using GPU and non-GPU arrays for this particular operation.
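
For example, here is a minimal sketch of column-wise cross-entropy written with plain broadcasting and reductions (the array names and sizes are illustrative, and the GPU lines assume CUDA.jl, since the thread doesn't name a package); the same code runs unchanged on CPU and GPU arrays:

```julia
using Statistics  # for mean

# Toy data: 3 classes × 5 observations, one observation per column.
y_target  = Float32[1 0 0 1 0;
                    0 1 0 0 1;
                    0 0 1 0 0]                 # one-hot targets
y_predict = rand(Float32, 3, 5)
y_predict ./= sum(y_predict; dims=1)           # normalize columns to probabilities

# Cross-entropy per column, then the mean over columns.
ϵ = eps(Float32)                               # guard against log(0)
per_column = -sum(y_target .* log.(y_predict .+ ϵ); dims=1)  # 1×5 row of losses
loss = mean(per_column)

# On the GPU (assuming CUDA.jl): move the arrays over and run the same lines.
# using CUDA
# loss_gpu = mean(-sum(cu(y_target) .* log.(cu(y_predict) .+ ϵ); dims=1))
```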

In the future, I’d also recommend using the #domain:ml category so that you get more domain-expert eyes on your questions. Most people are not subscribed to #usage because of the high volume of messages.

Thank you for letting me know! I did not see anything in the docs that explains why this operation cannot be done in parallel (computing the loss on each column at the same time) on the GPU. Is there a reason I am not aware of why this hasn’t been implemented on the GPU?

Thank you,

Jack

I’m not sure what exactly you mean by “parallel on each column at the same time”, but it absolutely does make use of GPU parallelism (just like the cross-entropy loss function in any other ML framework). The easiest way to find out if it works for your particular use case is just to run it with some GPU arrays; I think you’ll be pleasantly surprised by the results :wink:
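
If you want to convince yourself, here is a quick self-contained check (assuming CUDA.jl is installed and functional; the sizes and names are arbitrary) that computes every column's loss on the GPU at once:

```julia
using CUDA, Statistics

# Hypothetical sizes: 1000 classes × 10_000 observations, one per column.
ŷ = CUDA.rand(Float32, 1000, 10_000)
ŷ ./= sum(ŷ; dims=1)                   # normalize columns to probabilities

y_cpu = zeros(Float32, 1000, 10_000)
y_cpu[1, :] .= 1f0                     # toy one-hot targets, all class 1
y = cu(y_cpu)                          # move targets to the GPU

# All 10_000 column losses come out of the same GPU reduction,
# then get averaged; there is no per-column loop on the CPU.
CUDA.@time loss = mean(-sum(y .* log.(ŷ .+ eps(Float32)); dims=1))
```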


Thank you for clarifying! From your previous answer I had thought that it was all processed on the CPU, but I understand now.