Does Knet support nested gradient for conv?

I need to take the gradient of the gradient of a neural network (this is needed for WGAN with gradient-penalty regularization). I just found out that Flux (the 0.9 release) does not support it:

Nested AD not defined for conv

Stacktrace:
...

I am curious whether Knet (@denizyuret) supports nested AD for all network layers. I know this is supported in PyTorch.
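For reference, here is a minimal sketch of what I am trying to do, written in PyTorch (where nested AD works). The critic below is a stand-in MLP rather than the conv net from my actual model; `create_graph=True` is what makes the gradient itself differentiable a second time:

```python
import torch

# Hypothetical critic: a tiny MLP standing in for the conv-based critic.
critic = torch.nn.Sequential(
    torch.nn.Linear(4, 8),
    torch.nn.ReLU(),
    torch.nn.Linear(8, 1),
)

real = torch.randn(16, 4)
fake = torch.randn(16, 4)

# WGAN-GP: penalize the critic's gradient norm at points interpolated
# between real and fake samples.
eps = torch.rand(16, 1)
x = (eps * real + (1 - eps) * fake).requires_grad_(True)

out = critic(x)
# create_graph=True keeps the graph of the first backward pass, so the
# penalty can itself be differentiated (the nested/second-order gradient).
grads, = torch.autograd.grad(out.sum(), x, create_graph=True)
penalty = ((grads.norm(2, dim=1) - 1) ** 2).mean()
penalty.backward()  # differentiates through the first gradient
```

After `penalty.backward()`, the critic's parameters carry gradients that flow through the inner `autograd.grad` call; this is exactly the step that fails in Flux 0.9 when the critic contains a conv layer.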

Unfortunately not. I recently implemented the missing pieces for nested gradients in MLPs; I still need to work out the second gradients of conv to add support there.