Flux.jl: Changing the values of hidden node activations during backprop

Hi,
I have a research project where I want to change the values of the activations during backpropagation. It works like this: first I do the forward pass the traditional way and compute the node activations at every layer, then I compute the cost/loss. Before applying the gradient update, I want to modify/corrupt the activation values of each node so that the gradient update is done with these new values. In other words, the forward pass and the loss are computed with the actual activations, but backprop updates the weights using the corrupted hidden node activations.

One way to do this is to implement the network from scratch with basic linear algebra and keep track of backprop with the corrupted values manually, roughly like the sketch below. But my models can be very big and need to be flexible, so manual tracking is not easy.
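To make the idea concrete, here is a minimal hand-written sketch of what I mean, with a toy two-layer network (all the names, including the `corrupt` function, are just placeholders for whatever corruption scheme I end up using):

```julia
# Toy two-layer network: x -> logistic.(W1 * x) -> W2 * a1 -> ŷ, squared-error loss.
logistic(z) = 1 / (1 + exp(-z))
corrupt(a) = a .+ 0.1f0 .* randn(Float32, size(a)...)   # placeholder corruption

function corrupted_step!(W1, W2, x, y; η = 0.01f0)
    # Forward pass and loss with the *real* activations
    z1 = W1 * x
    a1 = logistic.(z1)
    ŷ  = W2 * a1
    loss = sum(abs2, ŷ .- y)

    # Corrupt the hidden activations before the backward pass
    ã1 = corrupt(a1)

    # Backward pass: the weight gradients see the corrupted activations
    δ2  = 2 .* (ŷ .- y)                      # dL/dŷ
    gW2 = δ2 * ã1'                           # uses corrupted a1
    δ1  = (W2' * δ2) .* ã1 .* (1 .- ã1)      # logistic derivative (could also use the real a1 here)
    gW1 = δ1 * x'

    W1 .-= η .* gW1
    W2 .-= η .* gW2
    return loss
end
```

Something like `corrupted_step!(W1, W2, x, y)` would then be called in the training loop, but you can see why this doesn't scale to large, changing architectures.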

Is there a way to do this easily in Flux? Can anybody give me some pointers on how I can change the activation values of nodes inside Flux.jl?
I'm also open to any other framework that makes this easy, not necessarily Flux.
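For what it's worth, the closest thing I can think of is writing a custom `ChainRulesCore.rrule` for each layer's affine transform, so that the pullback sees a corrupted copy of the input activations. A rough sketch (again, `corrupt` and `corrupted_dense` are just illustrative names, and I'm not sure this is the idiomatic way to hook into Zygote):

```julia
using Flux, ChainRulesCore

corrupt(a) = a .+ 0.1f0 .* randn(Float32, size(a)...)   # placeholder corruption

# Forward pass behaves like a plain dense layer with a logistic activation.
corrupted_dense(W, b, x) = Flux.sigmoid.(W * x .+ b)

function ChainRulesCore.rrule(::typeof(corrupted_dense), W, b, x)
    z = W * x .+ b
    y = Flux.sigmoid.(z)
    x̃ = corrupt(x)                        # corrupted activations, seen only by backprop
    function corrupted_dense_pullback(ȳ)
        δ  = unthunk(ȳ) .* y .* (1 .- y)  # dL/dz for the logistic activation
        ∂W = δ * x̃'                       # weight gradient uses the corrupted input
        ∂b = vec(sum(δ; dims = 2))
        ∂x = W' * δ
        return (NoTangent(), ∂W, ∂b, ∂x)
    end
    return y, corrupted_dense_pullback
end

# Zygote picks up the rrule, so the gradients are taken through the corrupted values.
W, b = randn(Float32, 3, 5), zeros(Float32, 3)
x, ytarget = randn(Float32, 5, 4), randn(Float32, 3, 4)   # batch of 4
grads = Flux.gradient((W, b) -> sum(abs2, corrupted_dense(W, b, x) .- ytarget), W, b)
```

But doing this for every layer type I use feels clumsy, so I'm hoping there is a cleaner hook inside Flux itself.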

Thanks,
v-i-s-h