Type-stable caching of multiple arrays with different eltype

Hello,
I was wondering whether it is possible to write some kind of cache pool from which I can request cache arrays of various types. Each type of cache array should be created only once. I’ve tried

# cache arbitrary arrays of dim N
struct CachePool{N}
    caches::Dict{Tuple{DataType, NTuple{N, Int}}, AbstractArray}
    CachePool(N) = new{N}(Dict{Tuple{DataType, NTuple{N,Int}}, AbstractArray}())
end

# request a cache of specific type and size
# allocate new cache if there is none yet
function getcache(c::CachePool, T, size...)::T
    key = (T, size)
    if haskey(c.caches, key)
        cache = c.caches[key]::T
    else
        cache = T(undef, size...)
        c.caches[key] = cache
    end
    return cache
end

pool = CachePool(2)
@code_warntype getcache(pool, Matrix{Int}, 3, 1_000)

but (kind of as expected) the c.caches[key] lookup is not type-stable.
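To see the instability concretely, here is a minimal check reusing the Dict layout from the snippet above (rawlookup and assertedlookup are made-up names for illustration): the Dict’s value type is AbstractArray, so a raw lookup infers abstractly, while a ::T assertion at the call site narrows the inferred return type.

```julia
caches = Dict{Tuple{DataType,NTuple{2,Int}},AbstractArray}()
caches[(Matrix{Int}, (3, 4))] = zeros(Int, 3, 4)

rawlookup(d, key) = d[key]                               # inferred as AbstractArray
assertedlookup(d, key, ::Type{T}) where {T} = d[key]::T  # narrowed to T

Base.return_types(rawlookup, (typeof(caches), keytype(caches)))        # [AbstractArray]
Base.return_types(assertedlookup, (typeof(caches), keytype(caches), Type{Matrix{Int}}))  # [Matrix{Int}]
```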

For some context, here is the problem I’m trying to solve (maybe there is a much better solution?).
I have a function for differential equations which needs a cache.

struct ODEFun{T}
    _cache::T
end
function (ode::ODEFun)(du, u, p, t)
    f!(ode._cache, u, ...)  # update cache with expensive calculations
    g!(du, ode._cache, ...) # update du with cache
end

This works fine for explicit solvers. However, for implicit solvers the types of du and u change to some AD types. I’d love to have a generic version

struct ODEFun{T}
    _cachepool::T
end
function (ode::ODEFun)(du::T, u::T, p, t) where {T}
    cache = getcache(ode._cachepool, T, size(u)...)
    f!(cache, u, ...)  # update cache with expensive calculations
    g!(du, cache, ...) # update du with cache
end

which would work for different AD types as well as CuArrays for example.

Any ideas? :slight_smile:

See @stevengj’s implementation at FAQ suggestion: lazily allocate buffers for use with automatic differentiation · Issue #769 · SciML/DifferentialEquations.jl · GitHub, and see the diffcache discussion at ForwardDiff caches usage. Once the JuliaCon videos are done, I think it would be good to spin out a package around this.


Ah, thanks! A simple change to use the get! function, as in the linked issue, did the job. It is much nicer anyway :smiley:

function getcache(c::CachePool, T, size...)::T
    key = (T, size)
    return get!(c.caches, key) do
        T(undef, size...)
    end::T
end

@code_typed still reports the instability, but the allocation is gone and it is about 10 times faster :thinking:
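For completeness, here is a self-contained version of the fixed pool plus a quick check (repeating the struct from the first post so the snippet runs on its own):

```julia
# Pool of cached arrays, keyed by (type, size); each combination allocated once.
struct CachePool{N}
    caches::Dict{Tuple{DataType,NTuple{N,Int}},AbstractArray}
    CachePool(N) = new{N}(Dict{Tuple{DataType,NTuple{N,Int}},AbstractArray}())
end

function getcache(c::CachePool, T, size...)::T
    return get!(c.caches, (T, size)) do
        T(undef, size...)  # allocate only if no cache of this type/size exists
    end::T
end

pool = CachePool(2)
A = getcache(pool, Matrix{Float64}, 3, 4)  # allocates on the first request
B = getcache(pool, Matrix{Float64}, 3, 4)  # cache hit: the same array
A === B  # true
```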

You’ll get a dynamic dispatch, but the type-inference issue won’t “leak” out to the rest of your code. This means you pay a price of about 100 ns or so extra. For most models that probably doesn’t matter, since it’s about as expensive as a few ^ calculations. Because it’s dynamic, some static tooling like JET will have a hard time with it as well, but for the most part it should be simpler without a noticeable performance loss on most models.
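To make the “won’t leak out” point concrete, here is a toy sketch (POOL, kernel!, and step! are made-up names; the pool is simplified to a global Dict): the ::T assertion acts as a function barrier, so everything after the one dynamic lookup runs fully specialized code.

```julia
# Simplified global pool; the ::T assertion narrows the dynamically typed value.
const POOL = Dict{Tuple{DataType,Tuple{Vararg{Int}}},Any}()

getcache(::Type{T}, sz::Int...) where {T} = get!(() -> T(undef, sz...), POOL, (T, sz))::T

# Stand-ins for the expensive f!/g! from the ODE example; concretely typed.
function kernel!(du, cache, u)
    @. cache = 2 * u   # expensive intermediate (plays the role of f!)
    @. du = cache + 1  # final result (plays the role of g!)
    return du
end

function step!(du::T, u::T) where {T<:AbstractArray}
    cache = getcache(T, size(u)...)  # the only dynamic lookup
    return kernel!(du, cache, u)     # fully inferred from here on
end

u = ones(3); du = similar(u)
step!(du, u)  # du == [3.0, 3.0, 3.0]
```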