How careful must I be when calling setindex! on untracked arrays in ReverseDiff.jl?

While comparing ReverseDiff.jl to a similar Python package (autograd), I noticed the following in autograd's documentation:

> In-place modification of arrays not being differentiated with respect to (for example, A[i] = x or A += B) won’t raise an error, but be careful. It’s easy to accidentally change something without autograd knowing about it. This can be a problem because autograd keeps references to variables used in the forward pass if they will be needed on the reverse pass. Making copies would be too slow.

My reading is this: during the forward pass, the package records operations that apply A to a tracked variable, but it keeps only a reference to A rather than a copy. On the reverse pass it reads A back through that reference, so if untracked instructions have modified A in the meantime, those changes will silently leak into the computed derivatives.
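
If that reading is right, the failure mode could be sketched as below. This is a minimal illustration of my own, not code from either package's documentation: the array A, the function f, and the concrete values are all assumptions, and whether the stale reference actually bites depends on how ReverseDiff stores intermediates for each recorded operation.

```julia
using ReverseDiff

A = [1.0 2.0; 3.0 4.0]      # untracked array, possibly held on the tape by reference

function f(x)
    y = A * x               # forward pass uses (and may keep a reference to) A
    A[1, 1] = 100.0         # in-place edit the tape knows nothing about
    return sum(y)
end

x = [1.0, 1.0]
g = ReverseDiff.gradient(f, x)
# With A as it was when y was computed, the gradient of sum(A * x) is the
# vector of column sums, [4.0, 6.0]. If the reverse pass re-reads A through
# its reference, it could instead report [103.0, 6.0].
```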

Does this warning apply to ReverseDiff.jl as well? If so, is my interpretation of it correct?
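
For context, the concern would seem especially relevant for prerecorded tapes, where the recorded instructions outlive a single gradient call. GradientTape and gradient! are ReverseDiff's actual API; the surrounding setup is again my own hypothetical:

```julia
using ReverseDiff

A = [1.0 2.0; 3.0 4.0]
f(x) = sum(A * x)

x = [1.0, 1.0]
tape = ReverseDiff.GradientTape(f, x)           # records instructions that reference A
g1 = ReverseDiff.gradient!(similar(x), tape, x)

A[1, 1] = 100.0                                 # untracked in-place update
g2 = ReverseDiff.gradient!(similar(x), tape, x)
# Whether g2 reflects the new A or stale cached values depends on how each
# recorded instruction stored its inputs, which is exactly the ambiguity
# the autograd warning is about.
```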

As a side note, there's also AutoGrad.jl, which is probably closer in design to Python's autograd. That doesn't mean you should prefer one over the other, though.