In-place implementation for neural networks in Lux.jl

Hi everyone,
I want to know whether using in-place versions of the Lux layers would be better. I noticed a related pull request (https://github.com/LuxDL/Lux.jl/pull/463), but I don't know why it was closed.

Since GPU memory usage keeps increasing while I train a Lux model, I think in-place layers would reduce memory allocation.

In-place versions would require a massive rewrite and wouldn't really be that advantageous without other optimizations. Instead, check out Reactant + Lux (Compiling Lux Models using Reactant.jl | Lux.jl Docs); that will be significantly faster and will pre-allocate the required VRAM once compiled.
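
For reference, a minimal sketch of that workflow along the lines of the linked tutorial (the model architecture, sizes, and variable names here are illustrative assumptions, not your actual setup):

```julia
using Lux, Reactant, Random

# Move parameters/states and inputs to the Reactant (XLA) device
dev = reactant_device()

# A toy model; substitute your own architecture
model = Chain(Dense(2 => 32, tanh), Dense(32 => 2))
ps, st = Lux.setup(Random.default_rng(), model) |> dev

x = rand(Float32, 2, 32) |> dev

# Compile the forward pass once; buffers are allocated at compile time,
# so repeated calls reuse the same VRAM instead of allocating fresh memory
model_compiled = @compile model(x, ps, Lux.testmode(st))
y, st_ = model_compiled(x, ps, Lux.testmode(st))
```

The key point is that compilation happens once per input shape, so in a training loop you call the compiled function repeatedly with same-shaped batches and the memory footprint stays flat.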
