I’m searching for parameters u that maximize a function obj(u). The objective, however, depends on the derivatives of another function y(u, p) with respect to the parameters p.

An example:

```
abstract type S1 end  # custom tag for the inner Dual numbers
using ForwardDiff: Dual

function y(u, p1, p2)
    # Seed duals so partials[1] tracks ∂/∂p1 and partials[2] tracks ∂/∂p2.
    p1_sen = Dual{S1}(p1, 1.0, 0.0)
    p2_sen = Dual{S1}(p2, 0.0, 1.0)
    n_u = size(u, 2)
    y = zeros(typeof(p1_sen * u[1, 1]), n_u)
    y[1] = 0.0
    for i in 2:n_u
        y[i] = y[i-1] + p1_sen * p2_sen * u[1, i] + p1_sen^2 * u[2, i]
    end
    y
end

function obj(u)
    p1 = 2.0
    p2 = 3.0
    sensitivities = y(u, p1, p2)
    # Product of the two parameter sensitivities of the final state.
    sensitivities[end].partials[1] * sensitivities[end].partials[2]
end

u = ones(2, 100)
obj(u)
```
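For this particular input the objective can be checked by hand, which is useful when testing any AD approach on top of it: with u all ones, each of the 99 loop iterations adds p2 + 2·p1 to the p1-partial and p1 to the p2-partial of y[end], so obj(u) should equal (99·7)·(99·2). A quick arithmetic sketch of that check:

```
# Hand-computed value of obj(u) for u = ones(2, 100).
n_steps = 99           # the loop runs over i = 2:100
p1, p2 = 2.0, 3.0
dA = p2 + 2p1          # per-step contribution to ∂y/∂p1 (u entries are 1)
dB = p1                # per-step contribution to ∂y/∂p2
expected = (n_steps * dA) * (n_steps * dB)
println(expected)      # 137214.0
```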

For efficient optimization I would like to compute the gradient of obj(u), preferably with reverse-mode AD, but I’m not sure how to do this for an objective that already uses another automatic differentiation tool internally.

I could compute the full Hessian of y(u, p), but that would be wasteful: only the mixed second-order derivatives between u and p are needed, and in general there are many more parameters in u than in p.
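Whatever nested-AD approach ends up working, a cheap baseline helps to validate it. A minimal central-finite-difference gradient sketch (fd_gradient is a hypothetical helper, not part of any library; it costs two evaluations of f per entry of u, so it is only for testing):

```
# Central finite differences over every entry of u.
function fd_gradient(f, u; h = 1e-6)
    g = zero(u)
    for i in eachindex(u)
        up = copy(u); up[i] += h
        um = copy(u); um[i] -= h
        g[i] = (f(up) - f(um)) / (2h)
    end
    g
end
```

Comparing an AD gradient of obj against fd_gradient(obj, u) on a small u would catch tag clashes or dropped partials early.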

Any advice would be appreciated.