On the future of Flux.destructure and SciML integration

In this regime, destructure is expensive. While I haven’t checked this example today, my claim above is that using ComponentArrays.jl will also be expensive. I’d say that explicit vs. implicit isn’t the right axis here; it’s more like flat vector vs. nested structure. For sufficiently small networks, converting back and forth may take longer than the matrix multiplications themselves.
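
If you want to check this for a particular model, a rough way is to time the flat-to-nested round trip against a plain forward pass. A minimal sketch, with arbitrary layer sizes, assuming Flux, Optimisers and BenchmarkTools are loaded:

# Compare rebuilding the nested model from the flat vector (what re(v) does)
# against the forward pass alone; for small networks the rebuild can dominate.
using Flux, Optimisers, BenchmarkTools

model = Chain(Dense(10 => 32, tanh), Dense(32 => 1))
x = randn(Float32, 10, 64)

v, re = Optimisers.destructure(model)

@btime $re($v)($x)   # restructure, then forward pass
@btime $model($x)    # forward pass only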

There is more that could be done here: for example, Optimisers.jl could fairly easily add a destructure! which re-uses the same model, and that might save quite a bit. See e.g. issue 146.
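
To illustrate the idea only (this is not an existing Optimisers.jl API): such a destructure! could walk the model with Functors.jl and copy the flat vector back into the arrays the model already owns, rather than allocating a fresh nested copy. The name restructure! and the details below are purely illustrative:

# Hypothetical sketch, not part of Optimisers.jl. It treats every real array as
# a parameter and assumes the traversal order matches how the flat vector was
# built, glossing over trainable vs. non-trainable fields.
using Functors

function restructure!(model, flat::AbstractVector)
    offset = 0
    fmap(model; exclude = x -> x isa AbstractArray{<:Real}) do x
        n = length(x)
        copyto!(x, 1, flat, offset + 1, n)  # overwrite in place, no new arrays
        offset += n
        x
    end
    return model
end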

This is the regime where SimpleChains.jl is likely to be much faster. It’s a completely different design, which never uses a nested structure at all, but it has various other limitations.
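
For a flavour of that design (the exact constructor signatures here are from memory, so treat them as an assumption): the model object carries no parameters at all, and you pass one flat parameter vector alongside the data on every call.

# SimpleChains keeps all parameters outside the model, in one flat vector.
# Sizes are arbitrary; check the SimpleChains.jl README for the current API.
using SimpleChains

model = SimpleChain(static(2),              # input dimension fixed up front
                    TurboDense(tanh, 32),
                    TurboDense(identity, 1))

p = SimpleChains.init_params(model)         # flat vector of all parameters
x = rand(Float32, 2, 100)
model(x, p)                                 # parameters passed explicitly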

How shared (tied) arrays are handled is a key difference from ComponentArrays.jl, which perhaps we should highlight. (As well as being the main source of coding headaches!) I’m glad if it’s useful to someone. In the example below, ComponentArray copies the data, so the two uses of twice become 6 independent parameters, while destructure stores the tied array once, giving 4:

julia> using ComponentArrays, Optimisers

julia> let twice = [1.0, 2.0]
        cv = ComponentArray(x=twice, y=twice, z=[1.0, 2.0])
        cv.x[1] += 999
        cv  # this has 6 independent scalar parameters
       end
ComponentVector{Float64}(x = [1000.0, 2.0], y = [1.0, 2.0], z = [1.0, 2.0])

julia> let twice = [1.0, 2.0]
        v, re = destructure((x=twice, y=twice, z=[1.0, 2.0]))
        @show v  # only 4 independent parameters, since x and y are the same array
        v[1] += 999
        re(v)
       end
v = [1.0, 2.0, 1.0, 2.0]
(x = [1000.0, 2.0], y = [1000.0, 2.0], z = [1.0, 2.0])