Best libraries for experimenting with small scale models?

I’m looking for tools to do ML experiments in Julia. Specifically, I’d like to try designing some small-ish models (hundreds to thousands of parameters) and experiment with approaches that aren’t just differently ordered chains of the usual NN layers. Basically, what I’d like is a framework that provides me with:

  • automatic differentiation
  • standard optimization algorithms
  • loss functions

and very little more, otherwise leaving me almost full freedom to write the model as an arbitrary function. Running on CPU is perfectly fine. Is there something that fits these requirements? Lux.jl seemed like a possible choice, but I haven’t dug deep into it.

Lux.jl is certainly a good option. Depending on what exactly you need, you may also be fine with just combining DifferentiationInterface.jl (for autodiff), any optimization package (Optimisers.jl, Optim.jl, Optimization.jl), and custom loss functions, or the losses in Flux.jl, Lux.jl, or LossFunctions.jl.
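To illustrate the "no framework" route, here is a minimal sketch using DifferentiationInterface.jl with a ForwardDiff.jl backend and Optimisers.jl. The model, loss, and data are hypothetical examples (not from this thread); the model is just a plain Julia function of a parameter vector, which is the kind of freedom you asked about:

```julia
using DifferentiationInterface          # generic `gradient(f, backend, x)` API
import ForwardDiff                      # load the actual AD engine
using Optimisers                        # standard optimizers (Adam, etc.)

# The "model" is any function of a parameter vector θ — no layer types needed.
model(θ, x) = θ[1] .* sin.(θ[2] .* x) .+ θ[3]

# Hand-written mean-squared-error loss
loss(θ, x, y) = sum(abs2, model(θ, x) .- y) / length(y)

function train(θ, x, y; epochs = 500)
    backend = AutoForwardDiff()         # backend type re-exported from ADTypes.jl
    state = Optimisers.setup(Optimisers.Adam(0.05), θ)
    for _ in 1:epochs
        g = gradient(p -> loss(p, x, y), backend, θ)   # autodiff via DI
        state, θ = Optimisers.update(state, θ, g)      # one optimizer step
    end
    return θ
end

# Toy data: recover amplitude, frequency, and offset of a sine
x = collect(range(0, 2π; length = 50))
y = 1.5 .* sin.(0.8 .* x) .+ 0.2

θ̂ = train(randn(3), x, y)
```

Since the parameter count is tiny and everything runs on CPU, forward-mode AD is a reasonable default here; for larger models you could swap the backend (e.g. `AutoZygote()`) without touching the training loop.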
