Dense Layers, softmax, relu

I am running through the 60 minute blitz in Flux's model zoo and was looking for some clarity on functions that I haven't seen before. I am on line 205 and I am struggling to understand what the Dense function does, as well as what the relu option does. I have not been able to find documentation for functions in Flux (`?Dense` returns that no documentation was found).

I know that this function somehow creates a dense layer in a neural net, and presumably the numbers are the dimensions of the weight matrix in this layer? But so far I have had to learn what is going on by inference. Can someone briefly explain either where I can find the documentation for functions in packages, or alternatively what Dense(10, 5, relu) (model zoo, 60 minute blitz, line 205) does?
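For anyone landing here with the same question: Dense(in, out, σ) is a fully connected layer that stores a weight matrix of size (out, in) and a bias vector of length out, and computes σ.(W * x .+ b) when called. A minimal sketch in plain Julia (no Flux; the names `MyDense`, `W`, `b` are illustrative, not Flux's internals):

```julia
# relu activation: clamps negative values to zero
relu(x) = max(zero(x), x)

# A rough stand-in for Flux's Dense layer.
struct MyDense
    W::Matrix{Float64}   # weight matrix, size (out, in)
    b::Vector{Float64}   # bias vector, length out
    σ                    # activation function
end

# Dense(in, out, σ) initializes W with size (out, in), so the layer
# maps a length-`in` input to a length-`out` output.
MyDense(in::Int, out::Int, σ=identity) = MyDense(randn(out, in), zeros(out), σ)

# Calling the layer computes σ.(W * x .+ b)
(d::MyDense)(x) = d.σ.(d.W * x .+ d.b)

layer = MyDense(10, 5, relu)   # analogous to Dense(10, 5, relu)
x = randn(10)                  # a length-10 input vector
y = layer(x)                   # a length-5 output, all non-negative
```

So in Dense(10, 5, relu), the 10 is the input dimension and the 5 is the output dimension, with relu applied elementwise afterwards.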


Thanks, I think that makes some sense to me, and I understand most of the line at this point. My only remaining question is: what does the relu option do here?

It's the activation function.
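Concretely, relu (rectified linear unit) is just max(0, x), applied elementwise to the layer's output so the layer is not purely linear. A quick illustration in plain Julia:

```julia
# relu clamps negative values to zero and passes positives through
relu(x) = max(zero(x), x)

# Dense applies it elementwise, i.e. relu.(W * x .+ b)
out = relu.([-2.0, -0.5, 0.0, 1.5])   # → [0.0, 0.0, 0.0, 1.5]
```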

Thank you!!!