Hi,
I’d like to know whether there is an easy way of getting sub- or super-gradients in Julia (auto-differentiation of non-differentiable functions…?).
Please share any related packages. Thanks.
In general, most autodiff packages in Julia return a valid sub-gradient at points where the function is non-differentiable.
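A small sketch of this behavior, assuming Zygote.jl (the AD backend used by Flux.jl) is installed; the exact value returned at a kink is a convention of the package's differentiation rule, not a guarantee:

```julia
using Zygote

# abs is non-differentiable at 0; its subdifferential there is [-1, 1].
# Zygote's rule returns one element of that set rather than erroring.
g = Zygote.gradient(abs, 0.0)[1]
println(g)  # a valid sub-gradient of abs at 0
```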
To be honest, I often use Flux.jl for auto-diff, but I’m not aware of how it works internally.
Is there any reference where I can check that AD tools provide sub-gradients?
Also, how about super-gradients?
→ Oh, well, it would be possible to get super-gradients by flipping the signs of the given function and its sub-gradient (a bit inconvenient, though).
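The sign-flip trick could be sketched as follows (assuming Zygote.jl; `supergrad` is a hypothetical helper name). For a concave `f`, `-f` is convex, so negating a sub-gradient of `-f` yields a super-gradient of `f`:

```julia
using Zygote

f(x) = -x^2 - abs(x)  # a concave example function

# Negate f, take a sub-gradient of the (convex) negation, then negate back.
supergrad(f, x) = -Zygote.gradient(y -> -f(y), x)[1]

println(supergrad(f, 0.5))  # → -2.0, the super-gradient (here: derivative) of f at 0.5
```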
Thanks