Differentiating optimization problem solutions in Julia

Hello,

I was recently made aware of the paper “Efficient and Modular Implicit Differentiation” (https://arxiv.org/abs/2105.15183), which is implemented in JAX in the JAXopt library (https://github.com/google/jaxopt).

It describes how to automatically generate differentiation rules for the solution of an optimization problem, given a function that defines its optimality conditions.
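To make the idea concrete (this is my own toy sketch, not code from the paper): if the solution x*(θ) satisfies optimality conditions F(x*(θ), θ) = 0, the implicit function theorem gives ∂x*/∂θ = -(∂F/∂x)⁻¹ ∂F/∂θ, so you can get exact derivatives from any black-box solver by differentiating only F:

```julia
using ForwardDiff, LinearAlgebra

# Toy problem: x*(θ) minimizes (x - θ)' * A * (x - θ), so the
# stationarity condition is F(x, θ) = 2A * (x - θ) = 0
# and the exact solution is x*(θ) = θ.
A = [2.0 0.0; 0.0 3.0]
F(x, θ) = 2A * (x - θ)

# Stand-in for any black-box solver that returns the optimum:
xstar(θ) = θ

# Implicit function theorem: ∂x*/∂θ = -(∂F/∂x) \ (∂F/∂θ),
# computed without differentiating through the solver's iterations.
function implicit_jacobian(θ)
    x = xstar(θ)
    ∂F∂x = ForwardDiff.jacobian(x̃ -> F(x̃, θ), x)
    ∂F∂θ = ForwardDiff.jacobian(θ̃ -> F(x, θ̃), θ)
    return -(∂F∂x \ ∂F∂θ)
end

implicit_jacobian([1.0, 2.0])  # ≈ 2×2 identity, since x*(θ) = θ
```

As I understand it, the paper's contribution is essentially automating this construction, with efficient linear solvers, for whole families of optimality conditions.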

I wanted to bring this up to see if it is on anyone’s radar; it sounds like it would be a nice general feature to add to the Julia autodiff ecosystem. We would have many use cases in our application area of tensor networks and quantum computing.

I have a feeling that this functionality may already be available in some form in the extensive Julia optimization and differential equation ecosystem, but I’m not so familiar with that part of Julia, so if it does exist it would be nice to hear about it!

-Matt


I’m not aware of an existing attempt, though it is slated to exist for GalacticOptim.jl: there is specifically a spot for parameters in its problem definition to support differentiating the solution, but we haven’t done it yet.

Here you go. https://github.com/JuliaNonconvex/NonconvexUtils.jl#hack-5-implicitfunction


Thanks for the pointers; I knew someone must have worked on this kind of thing.

I think ImplicitFunction from NonconvexUtils.jl is what we are looking for; when I get some free time, I’ll test it out and see if it works for our use cases.
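For anyone finding this thread later: the way such a wrapper typically hooks into the Julia AD ecosystem is by defining a ChainRulesCore.rrule for the solve, so that reverse-mode pullbacks solve one linear system with (∂F/∂x)' instead of backpropagating through solver iterations. Here is a rough, self-contained sketch of that pattern; the names `ImplicitSolver`, `solve`, and `conditions` are hypothetical and this is my own code, not NonconvexUtils.jl’s actual implementation:

```julia
using ChainRulesCore, ForwardDiff, LinearAlgebra

# Hypothetical wrapper for illustration: `solve` is any black-box
# optimizer mapping parameters θ to a solution x*, and `conditions`
# computes F(x, θ), which is zero at the solution.
struct ImplicitSolver{S,C}
    solve::S
    conditions::C
end

(s::ImplicitSolver)(θ) = s.solve(θ)

function ChainRulesCore.rrule(s::ImplicitSolver, θ)
    x = s.solve(θ)
    ∂F∂x = ForwardDiff.jacobian(x̃ -> s.conditions(x̃, θ), x)
    ∂F∂θ = ForwardDiff.jacobian(θ̃ -> s.conditions(x, θ̃), θ)
    function implicit_pullback(x̄)
        # Adjoint of ∂x*/∂θ = -(∂F/∂x)⁻¹ ∂F/∂θ: one linear solve
        # with the transposed Jacobian of the conditions.
        λ = ∂F∂x' \ unthunk(x̄)
        return (NoTangent(), -∂F∂θ' * λ)
    end
    return x, implicit_pullback
end
```

With a rule like this in place, Zygote (via ChainRules) picks up the custom pullback automatically, which is roughly what the ImplicitFunction hack appears to do judging by the README.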

DiffOpt.jl may be relevant too, but it looks pretty specific to JuMP (though I’m not familiar with the JuMP syntax so I can’t really tell at first glance).
