JuMP vs OpenMDAO

Does any of you have experience with both? How do they compare?

I have a lot of experience with OpenMDAO, but very little with JuMP. While there is overlap, I’d say the main use cases are quite different.

OpenMDAO’s biggest strength is efficiently computing total derivatives from provided partial derivatives - with the main use case being nonconvex gradient-based optimization. For example, if you have many components and provide partials for some or all of them, OpenMDAO can compute total derivatives for the whole system using a direct or adjoint method. This is particularly useful if you have nested solvers, where it may not be effective or feasible to use straight algorithmic differentiation and propagate dual numbers through the solver. It also has support for bi-directional coloring if you have sparse Jacobians.
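To make the direct/adjoint idea concrete, here’s a minimal NumPy sketch (my own illustration, not OpenMDAO’s actual API): a residual R(x, u) = A·u - b(x) = 0 plays the role of the nested solver, and the total derivative df/dx is recovered from the partials alone, without differentiating through the solve.

```python
import numpy as np

# Hypothetical system: residual R(x, u) = A u - b(x) = 0 implicitly
# defines the solver state u(x); objective f(u) = sum(u_i^2).
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])

def b(x):
    return np.array([x, x**2])

def solve_u(x):
    # stands in for a nested nonlinear solver
    return np.linalg.solve(A, b(x))

def f(u):
    return float(u @ u)

x = 1.5
u = solve_u(x)

# Partial derivatives a component would provide:
dR_du = A                        # dR/du
dR_dx = -np.array([1.0, 2 * x])  # dR/dx = -db/dx
df_du = 2 * u                    # df/du
df_dx_explicit = 0.0             # f has no explicit x dependence

# Direct method: solve dR/du * du/dx = -dR/dx, then chain rule
du_dx = np.linalg.solve(dR_du, -dR_dx)
direct_total = df_dx_explicit + df_du @ du_dx

# Adjoint method: solve (dR/du)^T * lam = (df/du)^T instead
lam = np.linalg.solve(dR_du.T, df_du)
adjoint_total = df_dx_explicit - lam @ dR_dx

# Sanity check against finite differences
h = 1e-6
fd = (f(solve_u(x + h)) - f(solve_u(x - h))) / (2 * h)
print(direct_total, adjoint_total, fd)  # all three should agree closely
```

The direct method scales with the number of design variables, the adjoint with the number of objectives/constraints; OpenMDAO picks between them (and handles the bookkeeping) based on the problem structure.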

I have little experience with JuMP so I may be mischaracterizing, but I’d say its greatest strength is setting up and solving convex optimization problems declaratively (although there are wrappers for nonconvex nonlinear solvers as well).
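For comparison, the kind of problem JuMP excels at looks like the linear program below. JuMP itself is a Julia modeling language, so this is just a Python analogue using `scipy.optimize.linprog` to show the shape of the problem, not JuMP syntax: maximize x + 2y subject to x + y <= 4, x >= 0, y >= 0.

```python
from scipy.optimize import linprog

# linprog minimizes, so negate the objective to maximize x + 2y
res = linprog(
    c=[-1.0, -2.0],
    A_ub=[[1.0, 1.0]],   # x + y <= 4
    b_ub=[4.0],
    bounds=[(0, None), (0, None)],  # x >= 0, y >= 0
)
print(res.x, -res.fun)  # optimum puts everything into y: x=0, y=4, objective 8
```

In JuMP you would write the same constraints symbolically with `@variable`/`@constraint`/`@objective` macros and hand the model to a solver of your choice, which is where its declarative interface shines.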

Both provide a higher-level interface that lets you swap out optimizers, but OpenMDAO doesn’t include optimizers specialized for convex problems.