This is one of those cases where looking at the full stack trace, not just the error message, helps. `embedding` is a callable struct that represents an embedding layer. Instead of passing that layer object to `Chain` like the tutorial does, you're passing the output of calling it on some arbitrary input. That output is a `Matrix`, so when `Chain` goes to call what it thinks is an `Embedding` layer, it ends up invoking `output_matrix_from_embedding(x)` and predictably fails, because a `Matrix` is not callable.
If you’re familiar with how layers work in other ML frameworks, Flux layers work much the same.
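To make the difference concrete, here's a minimal sketch (the vocabulary size, embedding dimension, and input are made-up numbers for illustration):

```julia
using Flux

# Hypothetical sizes: 10_000-token vocabulary, 64-dimensional embeddings
embedding = Flux.Embedding(10_000 => 64)

# Wrong: this calls the layer right away, so Chain receives a Matrix,
# and a Matrix is not callable.
# model = Chain(embedding(rand(1:10_000, 5)), Dense(64 => 1))

# Right: pass the layer itself; Chain will call it on the input later.
model = Chain(embedding, Dense(64 => 1))

# The model is applied to a vector of token indices at call time.
ŷ = model(rand(1:10_000, 5))
```

The key point is that `Chain` stores layers (callables) and applies them in sequence when you call the model, so it must receive the layer object, not the layer's output.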
Ok - as I am new to Julia, Flux, and deep learning, I understand that something basic was wrong.
Should I just pass the transposed document-term index as input to the embedding layer? And how will it be one-hot encoded? I am clearly missing something…