[ANN]: `StandaloneIpopt.jl`: Another option for using Ipopt in Julia

Hey all, just a quick announcement of a new package in the General registry called StandaloneIpopt.jl. As the name suggests, it is a lightweight package that provides two functions: ipopt_optimize and ipopt_nlsolve.

Here is a simple example to show the optimization syntax:

using StandaloneIpopt

# hs071
# min  x1*x4*(x1+x2+x3)+x3
# s.t. x1*x2*x3*x4 >= 25
#      x1^2 + x2^2 + x3^2 + x4^2 == 40
#      1 <= x1, x2, x3, x4 <= 5

# Objective:
obj(x) = x[1]*x[4] * (x[1]+x[2]+x[3]) + x[3]

# constraints:
con1 = [25.0, 1e22] => (x -> prod(x))
con2 = [40.0, 40.0] => (x -> sum(abs2, x))

# box values, the same for all parameters. If they weren't, you could instead
# pass in box_lower = [l_1, l_2, ..., l_n], box_upper = [u_1, u_2, ..., u_n]. 
(b_l, b_u) = (1.0, 5.0)

# init value:
ini = [1.0, 5.0, 5.0, 1.0]

# result:
res = ipopt_optimize(obj, ini, Constraints((con1, con2)),
                     box_lower=b_l, box_upper=b_u)

And here is a simple example of the nonlinear equation solver syntax, which in this case solves a quadratic problem by finding where the gradient is zero. This is obviously not the right way to solve that particular problem; it's just a syntax demo.

using StandaloneIpopt, LinearAlgebra, ForwardDiff

const M = Symmetric([exp(-abs(x-y)) for x in 1:10, y in 1:10])
const c = [cos(x) for x in 1:10]
obj(v)  = dot(v, M, v) + dot(v, c)
objg(buf, v) = ForwardDiff.gradient!(buf, obj, v) # in-place! 

res_nl  = ipopt_nlsolve(objg, normalize(ones(length(c))), print_level=0)

Here’s a quick run-down of notable features:

  • Mixed-mode AD that uses ReverseDiff for gradients and ForwardDiff over ReverseDiff for Hessians. Under the hood, a slightly unusual wrapper struct that I created compiles and stores ReverseDiff tapes for every input element type, so that when you compute the Hessian it looks up a compiled tape for arguments of type ForwardDiff.Dual{T,V,N}. Because the tapes are always compiled, you get fast derivatives even if your objective is sloppy with intermediate allocations. The flip side is that if your problem can't be taped (for example, because of value-dependent control flow), this feature probably won't help. Enable it with the kwarg mixed_ad=true.

  • Sparse constraint Jacobian support via SparseDiffTools.jl. Simply pass in a sparse matrix of Bools as the jac_sparsity kwarg.

  • Nonlinear equation solving via the trick of providing a dummy objective function and imposing f_j(x) = 0 for j = 1, …, N as constraints. It seems a bit weird at first; I first saw the trick in the KNITRO docs, but I've used it a lot and it works well.

  • Convenient options for passing in constraints: either as Constraints(([l_1, u_1] => con_1_fun, [l_2, u_2] => con_2_fun, ...)) or as Constraints(nconstraints::Int64, vector_constraint_fun, lower_bounds, upper_bounds). Passing in individual scalar components can sometimes make it harder to efficiently reuse work shared between components, but when that isn't an issue it's convenient to pass them in individually.
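To make the mixed-mode AD bullet concrete, here is a hedged sketch: the only thing new relative to the first example is the mixed_ad kwarg (which is the one named above); the rest is the hs071 setup.

```julia
using StandaloneIpopt

# The hs071 problem from above, re-solved with mixed-mode AD enabled.
obj(x) = x[1]*x[4] * (x[1]+x[2]+x[3]) + x[3]
con1 = [25.0, 1e22] => (x -> prod(x))
con2 = [40.0, 40.0] => (x -> sum(abs2, x))
ini  = [1.0, 5.0, 5.0, 1.0]
res  = ipopt_optimize(obj, ini, Constraints((con1, con2)),
                      box_lower=1.0, box_upper=5.0, mixed_ad=true)
```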
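For the sparse Jacobian support, the pattern is just a sparse matrix of Bools. A small sketch, where the tridiagonal shape is made up for illustration and jac_sparsity is the kwarg named above:

```julia
using SparseArrays

# Hypothetical: a tridiagonal constraint Jacobian pattern for a problem
# with n variables and n constraints. Entry (i, j) is true iff constraint
# i can depend on variable j.
n = 10
jac_pattern = spdiagm(-1 => trues(n-1), 0 => trues(n), 1 => trues(n-1))

# With obj, ini, and cons set up as in the earlier example, you would then
# pass it along like:
# res = ipopt_optimize(obj, ini, cons; jac_sparsity=jac_pattern)
```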
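The dummy-objective trick from the nlsolve bullet can also be sketched directly with ipopt_optimize. This is my reading of the trick, not necessarily what ipopt_nlsolve does internally, and the little constraint system here is made up:

```julia
using StandaloneIpopt

# Solve the system  x1^2 + x2 == 3,  x1 == x2  by minimizing a constant
# objective subject to equality constraints (lower bound == upper bound == 0).
f1 = [0.0, 0.0] => (x -> x[1]^2 + x[2] - 3.0)
f2 = [0.0, 0.0] => (x -> x[1] - x[2])
dummy(x) = 0.0
res = ipopt_optimize(dummy, [1.0, 1.0], Constraints((f1, f2)))
```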
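And the two constraint-passing forms from the last bullet, side by side for the hs071 constraints. I'm assuming the vector form's function returns the constraint values rather than writing into a buffer; check the package docs for the exact signature.

```julia
using StandaloneIpopt

# Form 1: a tuple of (bounds => scalar function) pairs, as in the example above.
scalar_form = Constraints(([25.0, 1e22] => (x -> prod(x)),
                           [40.0, 40.0] => (x -> sum(abs2, x))))

# Form 2: one vector-valued constraint function plus bound vectors.
vec_con(x)  = [prod(x), sum(abs2, x)]
vector_form = Constraints(2, vec_con, [25.0, 40.0], [1e22, 40.0])
```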

So yeah, I hope this will be useful to some people besides me. Some of these tools, particularly the mixed-mode AD and nonlinear equation solving with a solver as powerful as Ipopt, aren't really available in other packages, so I'm hoping it will be. Shout-out of course to the beautiful JuMP ecosystem and its maintainers like @odow, and to the NonConvex.jl package and its maintainers like @mohamed82008. Both are great tools and great choices; I just wanted something a bit lighter, with a couple of bells and whistles that aren't in the others.


Do you know why the link to your package from JuliaHub is broken?


Huh. Well that’s a bit of a pickle. I do not know why the JuliaHub thing is broken. I have another package that is hosted on sourcehut and in the official registries and that package’s JuliaHub page works fine. So maybe I’ll just give it a few days and if it’s still not working I’ll look into it more.


Do any of the people at Julia/JuliaComputing know why the links for most recently registered new packages are broken?


It's been this way for the past few weeks.
