Comparing ReverseDiff.jl to a similar Python package, autograd, I see the following in its documentation:
> In-place modification of arrays not being differentiated with respect to (for example, `A[i] = x` or `A += B`) won't raise an error, but be careful. It's easy to accidentally change something without autograd knowing about it. This can be a problem because autograd keeps references to variables used in the forward pass if they will be needed on the reverse pass. Making copies would be too slow.
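To put the same thing in Julia terms, I take the kind of operation the warning describes to be something like this (a minimal sketch; `A`, `f`, and the input values are just my own illustration):

```julia
using ReverseDiff

A = ones(3)                  # auxiliary data, not differentiated with respect to

function f(x)
    A[1] = 2.0               # in-place update of the untracked array (like `A[i] = x`)
    A .+= 1.0                # another in-place update (like `A += B`)
    return sum(A .* x)       # x is the tracked input
end

# As far as I can tell, nothing here raises an error, which matches the warning.
ReverseDiff.gradient(f, [1.0, 2.0, 3.0])
```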
From what I think I'm reading: on the forward pass the package records an operation involving `A` applied to a tracked variable, but it only keeps a reference to `A`, and on the reverse pass it uses whatever that reference points to in memory at that moment. So if untracked instructions have modified `A` in between, the differentiation will not correctly account for those changes.
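To make the question concrete, here is the pattern I am worried about (again just a sketch with made-up values): `A` is used once, then modified in place by untracked code, then used again.

```julia
using ReverseDiff

A = [1.0, 2.0, 3.0]          # untracked auxiliary array

function g(x)
    y = sum(A .* x)          # A is used here; I assume only a reference to A is recorded
    A[1] = 100.0             # untracked in-place modification after that first use
    z = sum(A .* x)
    return y + z
end

# If the reverse pass re-reads A through its stored reference, the contribution of
# the `y` term would be computed from the mutated A rather than the values it had
# on the forward pass, which is essentially what I am asking about.
ReverseDiff.gradient(g, [1.0, 1.0, 1.0])
```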
Does this warning also apply to ReverseDiff.jl? If so, is my interpretation above correct?