I was recently made aware of the paper “Efficient and Modular Implicit Differentiation”:
which is implemented in JAX here:
It is about automatically generating differentiation rules for optimization problem solutions given a function defining the optimality conditions.
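To make the core idea concrete, here is a toy scalar sketch of my own (not code from the paper or any of the packages below; the problem and all names are made up for illustration). Given an optimality condition F(x, θ) = 0 that defines the solution x*(θ), the implicit function theorem gives dx*/dθ = -(∂F/∂x)⁻¹ ∂F/∂θ, so the derivative of the solution can be computed from the condition alone, without differentiating through the solver's iterations:

```python
# Hypothetical toy problem: minimize f(x, theta) = (x - theta)**2 + lam * x**2
# over x. The optimality condition is the stationarity equation
#   F(x, theta) = df/dx = 2*(x - theta) + 2*lam*x = 0,
# whose closed-form solution is x*(theta) = theta / (1 + lam).

lam = 0.5

def solve(theta, iters=100, lr=0.1):
    """Any black-box solver works; here, plain gradient descent on f."""
    x = 0.0
    for _ in range(iters):
        x -= lr * (2 * (x - theta) + 2 * lam * x)
    return x

def implicit_grad(theta):
    """dx*/dtheta via the implicit function theorem, using only F."""
    x_star = solve(theta)            # solver output is all we need
    dF_dx = 2 + 2 * lam              # partial of F w.r.t. x at x_star
    dF_dtheta = -2.0                 # partial of F w.r.t. theta
    return -dF_dtheta / dF_dx        # = 1 / (1 + lam)

theta = 3.0
print(solve(theta))          # ≈ theta / (1 + lam) = 2.0
print(implicit_grad(theta))  # ≈ 1 / (1 + lam) ≈ 0.6667
```

The point is that `implicit_grad` never differentiates the loop inside `solve`; it only evaluates the partials of the optimality condition at the solution.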
I wanted to bring this up to see if it is on anyone’s radar, it sounds like it would be a nice general feature to add to the Julia autodiff ecosystem. We would have many use cases in our application area of tensor networks and quantum computing.
I have a feeling this functionality may already be available in some form in the extensive Julia optimization and differential equation ecosystem, but I'm not so familiar with that part of Julia, so if it is available, it would be nice to hear about it!
That’s an attempt. It is slated to exist for GalacticOptim.jl: there’s a spot for parameters in its problem definition specifically to support differentiation, but we haven’t done it yet.
Thanks for the pointers, I knew someone must have worked on this kind of thing.
I think ImplicitFunction from NonconvexUtils.jl is what we are looking for; when I get some free time, I’ll test it out and see if it works for our use cases.
DiffOpt.jl may be relevant too, but it looks pretty specific to JuMP (though I’m not familiar with the JuMP syntax, so I can’t really tell at first glance).
In the sense that you can probably differentiate through a significant number of solvers. However, if your solver is not differentiable (or if differentiating through it is too slow), then you need implicit differentiation, and I don’t think Optimization.jl supports that.
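To illustrate the distinction with a toy sketch of my own (a hypothetical scalar example, not code from Optimization.jl or any package mentioned here): "differentiating through the solver" means propagating derivatives across every iteration, which requires the iteration itself to be differentiable and costs work proportional to the number of steps, while implicit differentiation only needs the final solution plus one evaluation of the optimality condition's partials:

```python
# Hypothetical comparison: unrolled vs. implicit derivative for the
# iteration x <- x - lr * F(x, theta), where
#   F(x, theta) = 2*(x - theta) + 2*lam*x
# is the optimality condition of a toy quadratic problem.

lam, lr, iters = 0.5, 0.1, 200

def unrolled(theta):
    """Differentiate THROUGH the solver: carry (x, dx/dtheta) every step."""
    x, dx = 0.0, 0.0
    for _ in range(iters):
        F = 2 * (x - theta) + 2 * lam * x
        dF = (2 + 2 * lam) * dx - 2        # d(F)/dtheta along the iterates
        x -= lr * F
        dx -= lr * dF                      # forward-mode through the update
    return x, dx

def implicit(theta):
    """Implicit differentiation: run the solver, then one closed-form step."""
    x = 0.0
    for _ in range(iters):
        x -= lr * (2 * (x - theta) + 2 * lam * x)
    # dx*/dtheta = -(dF/dx)^{-1} * dF/dtheta = -(2 + 2*lam)^{-1} * (-2)
    return x, 2 / (2 + 2 * lam)

print(unrolled(3.0))  # both ≈ (2.0, 0.6667)
print(implicit(3.0))
```

Both agree on this toy problem, but only the implicit version would still work if the solver were a non-differentiable black box.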
Yup, it’s similar to what you’d do with ImplicitDifferentiation.jl, but instead you’d bake the rules into the package itself and have it reuse the caches and structures already allocated for the optimization in order to save memory.