Best practice to combine Flux.jl and Convex.jl

It looks like the issue here is that Flux is trying to differentiate through the Convex.jl solve during backpropagation, and Convex.jl problems are not differentiable yet. There is work in the Julia ecosystem on differentiating through optimization problems (DiffOpt.jl), but I don’t think it’s quite ready, and it isn’t hooked up to Convex.jl either.

Python’s cvxpylayers is more mature from what I’ve heard, so you might be able to use PyCall to call out to cvxpy in your loss function, but it will probably require a custom derivative rule to connect it to Julia’s autodiff ecosystem.
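
To illustrate what “a custom derivative rule” could look like, here is a minimal sketch (my own, not from the original answer) using `ChainRulesCore.rrule`, which is the standard way to tell Zygote (Flux’s AD) not to trace into a call and to use a hand-supplied pullback instead. `solve_layer` is a hypothetical name; as a stand-in for a real Convex.jl or cvxpylayers solve, it uses projection onto the nonnegative orthant, whose solution and derivative are known in closed form:

```julia
using ChainRulesCore

# Stand-in "solver": argmin_x ||x - θ||² subject to x ≥ 0 (elementwise).
# In practice this would call into Convex.jl or, via PyCall, cvxpylayers.
solve_layer(θ::AbstractVector) = max.(θ, 0.0)

function ChainRulesCore.rrule(::typeof(solve_layer), θ::AbstractVector)
    x = solve_layer(θ)
    function solve_layer_pullback(x̄)
        # In a real layer this vector-Jacobian product would come from the
        # solver's own differentiation machinery (e.g. DiffOpt.jl or
        # cvxpylayers' backward pass), not from AD tracing the solve.
        θ̄ = x̄ .* (θ .> 0)
        return NoTangent(), θ̄
    end
    return x, solve_layer_pullback
end
```

With a rule like this in place, `Zygote.gradient(θ -> sum(solve_layer(θ)), θ)` uses the custom pullback instead of trying (and failing) to differentiate through the solve itself.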
