This sounds like the sort of thing you could do via linearization, if the ODE models are time-invariant (autonomous).
That is, if you have an ODE dx/dt = f(x), then in the limit of small \Vert x \Vert (i.e. for the asymptotics of a solution decaying to zero), you can approximate it by the linearization dx/dt = Jx + f(0), where J is the Jacobian f'(0). Assuming f(0) = 0 (the system is decaying to zero), solutions of dx/dt = Jx are combinations of exponentials e^{\lambda t} in the eigenvalues \lambda of J, so the asymptotic decay rate is simply given by the eigenvalue of J with the largest real part (presumably negative for decaying solutions): the slowest-decaying mode dominates at long times.
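For concreteness, here is a minimal Python sketch of the decaying-to-zero case; the system f (a damped oscillator with a cubic nonlinearity) and all its coefficients are made up purely for illustration, not taken from your model:

```python
import numpy as np

# Hypothetical example: a damped oscillator with a cubic nonlinearity,
# chosen only for illustration; f(0) = 0, so solutions can decay to zero.
def f(x):
    return np.array([x[1], -x[0] - 0.5 * x[1] + x[0] ** 3])

# Jacobian f'(0), written out analytically here (finite differences or
# automatic differentiation would work just as well).
J = np.array([[0.0, 1.0],
              [-1.0, -0.5]])

# The asymptotic decay rate is the eigenvalue with the largest real part.
rate = max(np.linalg.eigvals(J).real)
print(rate)  # -0.25: decaying solutions behave like exp(-0.25 t) asymptotically
```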
The same approach works if the solution is decaying to some non-zero equilibrium x_0, but then you need to first solve the nonlinear equation f(x_0) = 0 to find the equilibrium x_0 of your model, and then linearize around it; i.e. J is now the Jacobian f'(x_0) evaluated at the equilibrium.
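A sketch of that two-step procedure, again with a made-up model (a hypothetical predator-prey-style f that settles to a nonzero equilibrium); scipy.optimize.fsolve finds x_0, and a crude forward-difference loop approximates the Jacobian there:

```python
import numpy as np
from scipy.optimize import fsolve

# Hypothetical predator-prey-style model (purely illustrative) that
# decays to a nonzero equilibrium rather than to zero.
def f(x):
    return np.array([x[0] * (1 - x[0]) - x[0] * x[1],
                     x[0] * x[1] - 0.5 * x[1]])

# Step 1: solve f(x0) = 0 for the equilibrium, starting from a guess
# near where the trajectory appears to settle.
x0 = fsolve(f, np.array([0.4, 0.4]))  # converges to ~[0.5, 0.5] here

# Step 2: approximate the Jacobian at x0 by forward finite differences.
def jacobian(f, x, h=1e-7):
    fx = f(x)
    J = np.empty((len(x), len(x)))
    for j in range(len(x)):
        step = np.zeros_like(x)
        step[j] = h
        J[:, j] = (f(x + step) - fx) / h
    return J

# Step 3: the decay rate toward x0 is the largest real part, as before.
rate = max(np.linalg.eigvals(jacobian(f, x0)).real)
print(rate)  # ~ -0.25 for this made-up model
```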
This is essentially the same thing as linear stability analysis, which you can find described in many textbooks and sources online.