From the Training Neural Networks in Hybrid Differential Equations example, I know that SciMLSensitivity is compatible with `PresetTimeCallback`. I was wondering whether it is possible to compute derivatives (and then optimize) with respect to the preset callback times (`dosetimes` in the linked example)?

I tried the following, but it did not seem to work: it returns `nothing` instead of the gradient.

```
using OrdinaryDiffEq, DiffEqCallbacks, SciMLSensitivity, Zygote

# Exponential decay with a +10 "dose" applied by a preset-time callback
function f(du, u, p, t)
    du[1] = -p[1] * u[1]
end

u0 = [10.0]
p = [1.0]
tspan = (0.0, 2.0)
prob = ODEProblem(f, u0, tspan, p)

affect!(integrator) = integrator.u[1] += 10
dose(x) = PresetTimeCallback(x, affect!)

# Differentiating with respect to the ODE parameters works...
loss1(p) = solve(prob, Tsit5(), u0 = u0, p = p, callback = dose(1.0),
                 saveat = 0.1, sensealg = ReverseDiffAdjoint())[1, end]
# ...but differentiating with respect to the dose time does not
loss2(x) = solve(prob, Tsit5(), u0 = u0, p = p, callback = dose(x),
                 saveat = 0.1, sensealg = ReverseDiffAdjoint())[1, end]

@show Zygote.gradient(loss1, [1.0]) # works
@show Zygote.gradient(loss2, [1.0]) # returns `nothing`
```

The output is:

```
Zygote.gradient(loss1, [1.0]) = ([-6.385494404700164],)
Zygote.gradient(loss2, [1.0]) = (nothing,)
```