Error using Enzyme to autodiff wrt Flux neural net parameters

I was able to pass set_runtime_activity() in a direct call to Enzyme.jacobian() and differentiate successfully, albeit with the performance hit that runtime activity brings.
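
For reference, here is a minimal sketch of the kind of direct call that worked; the model, input, and names below are placeholders rather than my actual setup:

using Enzyme, Flux

# Toy model and input, just to illustrate the pattern
model = Chain(Dense(4 => 8, tanh), Dense(8 => 2))
p0, re = Flux.destructure(model)   # flat parameter vector + reconstructor
x = rand(Float32, 4)

# Network output as a function of the flat parameter vector
f(θ) = re(θ)(x)

# Wrap the mode with set_runtime_activity and pass it directly to Enzyme.jacobian
# (the exact return container differs a bit between Enzyme versions)
J = Enzyme.jacobian(Enzyme.set_runtime_activity(Enzyme.Reverse), f, p0)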

Now I’d like to use the Optimization package to train, like so:

# Training loop
using Optimization, OptimizationOptimisers, BenchmarkTools  # BenchmarkTools provides @btime

adtype = Optimization.AutoEnzyme()
optf = Optimization.OptimizationFunction((p, args) -> lossDiff(p, args), adtype)
optprob = Optimization.OptimizationProblem(optf, p, args)
@btime begin
    res = Optimization.solve(optprob, OptimizationOptimisers.Adam(learning_rate); maxiters=max_iters, callback=callback)
end

How can I enable set_runtime_activity in such a way that the Optimization API knows what I want?
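
In case it helps frame the question: my untested guess is that AutoEnzyme (from ADTypes, re-exported by Optimization) accepts a mode keyword, so perhaps the runtime-activity-enabled mode can be passed through there, e.g.

# Untested guess: forward the runtime-activity mode through AutoEnzyme's mode keyword
adtype = Optimization.AutoEnzyme(; mode=Enzyme.set_runtime_activity(Enzyme.Reverse))

but I haven't been able to confirm that this is the intended route.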