Have a look at How to implement embeddings in Flux that aren't tragically slow? - #2 by dhairyagandhi96. Embeddings are an interesting case: they're trivial to implement as a loop on the GPU, but surprisingly hard to express efficiently as a vectorized computation. If you just want something that works, Transformers.jl/embed.jl at master · chengchingwen/Transformers.jl · GitHub has a working implementation, and there's a PR out to add something like it to NNlib.
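To make the trade-off concrete, here is a minimal sketch of two ways to do an embedding lookup in Flux. This is illustrative only (the array `W` and variable names are my own, not from Transformers.jl or NNlib): plain column indexing is the "loop-like" route, while multiplying by a one-hot matrix is one vectorized route that Flux supports out of the box via `onehotbatch`.

```julia
using Flux

vocab_size, embed_dim = 10, 4
W = randn(Float32, embed_dim, vocab_size)   # embedding table, one column per token id

tokens = [3, 1, 7]                          # token ids for a short sequence

# Route 1: direct column indexing. Simple and differentiable under Zygote,
# but the gradient accumulation can be slow on GPU (the topic of the thread).
emb_index = W[:, tokens]                    # embed_dim × length(tokens)

# Route 2: multiply by a one-hot matrix. OneHotMatrix multiplication is
# specialized in Flux, so this stays a fast lookup rather than a dense matmul.
emb_onehot = W * Flux.onehotbatch(tokens, 1:vocab_size)

emb_index == emb_onehot                     # both give the same embeddings
```

Both routes produce an `embed_dim × sequence_length` matrix; which one trains faster depends on the backend, which is exactly what the linked implementations optimize.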