Automatic Differentiation over Vectors

Consider the expression c = exp(A*t)*b, where A is a matrix, b and c are vectors, and t is a scalar. I’m trying to teach ForwardDiff to compute the derivative of c with respect to t, which is nothing but A*c. This is my attempt so far:
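In symbols, this is just the standard matrix-exponential identity

$$\frac{d}{dt}\, e^{A t} b = A\, e^{A t} b = A c.$$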

import ForwardDiff: derivative, Dual, value, partials
import ExponentialUtilities: expv
using SparseArrays

function expv(t::Dual, A, v::Vector; kw...)
    w = expv(value(t), A, v; kw...)       # primal: w = exp(A*value(t)) * v
    return Dual(w, (A*w) .* partials(t))  # attempt to attach the tangent A*w
end

A = sparse([0 1.; 1. 0])
v = [1., 0]

f(t) = expv(t, A, v)

derivative(f, 1.0) 

However, I get the following error, which makes me think that this is not possible at all. Is that the case? Is there another automatic differentiation tool that can do this?

ERROR: ArgumentError: Cannot create a dual over scalar type Array{Float64,1}. If the type behaves as a scalar, define ForwardDiff.can_dual.

I haven’t tried this, but I believe ForwardDiff.Dual needs to be the element type of the array w, not a wrapper around the whole vector. It may be as simple as broadcasting the constructor, Dual.(w, …). (But it may also just work without any definitions?)
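If that’s right, a minimal sketch of the element-wise construction might look like this (untested; it assumes the tag type T has to be propagated so that derivative accepts the result, and it uses the A*w tangent from above):

import ForwardDiff: derivative, Dual, value, partials
import ExponentialUtilities: expv
using SparseArrays

# Extend expv to Dual time: each element of the result becomes a Dual
# carrying its own partials, scaled by t's partials (chain rule).
function expv(t::Dual{T}, A, v::AbstractVector; kw...) where {T}
    w  = expv(value(t), A, v; kw...)  # primal: w = exp(A*value(t)) * v
    dw = A * w                        # tangent: d/dt exp(A*t)*v = A*w
    return map((wi, dwi) -> Dual{T}(wi, dwi * partials(t)), w, dw)
end

A = sparse([0 1.; 1. 0])
v = [1., 0]
derivative(t -> expv(t, A, v), 1.0)  # should agree with A * expv(1.0, A, v)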

The various reverse-diff packages (Flux, Zygote, Yota) all have ways to define custom gradients, but there you would define the reverse rule instead (I guess dot(Δ, A*c)?).
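For example, a minimal sketch using Zygote’s @adjoint macro, reusing A and v from above (untested; it only propagates the gradient to t, so the pullback returns nothing for A and v):

using LinearAlgebra, Zygote
import ExponentialUtilities: expv

# Custom reverse rule: with c = exp(A*t)*v we have dc/dt = A*c, so the
# pullback of a cotangent Δ onto t is dot(Δ, A*c).
Zygote.@adjoint function expv(t::Real, A, v::AbstractVector)
    c = expv(t, A, v)
    return c, Δ -> (dot(Δ, A * c), nothing, nothing)
end

Zygote.gradient(t -> sum(expv(t, A, v)), 1.0)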