ANN: NLPModels.jl v0.1.0

Hello everyone,

We’re happy to announce NLPModels.jl v0.1.0, a package providing ways to create nonlinear programming models with a standardized API. This allows writing algorithms that rely on that API to access the objective and constraint functions and their derivatives.

The package provides a few models, and some extensions have already been made.

A simple example (after installing NLPModels and ForwardDiff):

using NLPModels

nlp = ADNLPModel(x -> (x[1] - 1)^2 + 100*(x[2] - x[1]^2)^2, [-1.2; 1.0])
x = nlp.meta.x0 # [-1.2; 1.0]
obj(nlp, x) # Returns f(x)
g(x) = grad(nlp, x) # Returns the gradient at x
H(x) = hess(nlp, x) # Returns the lower triangle of the Hessian at x

Check the tutorial for more details, including an implementation of a Steepest Descent method using NLPModels.
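As a rough sketch of what such a solver looks like (the loop below is my own minimal version with hand-written gradients so it runs stand-alone, not the tutorial's code; with NLPModels you would pass x -> obj(nlp, x) and x -> grad(nlp, x) instead of f and g):

```julia
using LinearAlgebra

# Minimal steepest descent with Armijo backtracking (illustrative sketch).
function steepest_descent(f, g, x0; tol = 1e-5, max_iter = 100_000)
    x = copy(x0)
    for _ in 1:max_iter
        d = -g(x)                   # steepest descent direction
        norm(d) < tol && break      # stop when the gradient is small
        fx, t = f(x), 1.0
        # Armijo condition: require sufficient decrease before accepting t
        while f(x + t * d) > fx - 1.0e-4 * t * dot(d, d) && t > 1e-12
            t /= 2
        end
        x += t * d
    end
    return x
end

# The objective from the announcement's example
f(x) = (x[1] - 1)^2 + 100 * (x[2] - x[1]^2)^2
g(x) = [2 * (x[1] - 1) - 400 * x[1] * (x[2] - x[1]^2);
        200 * (x[2] - x[1]^2)]
x = steepest_descent(f, g, [-1.2; 1.0])  # slowly approaches [1.0, 1.0]
```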


Check our other packages too: JuliaSmoothOptimizers

Best,

Abel S. Siqueira


How does this differ from nonlinear models made in JuMP? (Not trying to be antagonistic, just curious.)

I’ll second the request for a compare and contrast to JuMP.

If I understand correctly, it’s an abstraction layer like MathProgBase but designed more specifically for writing NLP solvers.

Hello again.

Miles is correct: NLPModels is closer to MathProgBase than to JuMP, but we focus on nonlinear optimization and on writing solvers.

The MPB way, from what I saw, is to implement the functions a solver requires, and JuMP provides a way for the user to write a model that connects to MPB.

The NLPModels way is to write solvers that take an AbstractNLPModel as argument and use the NLPModels API. Furthermore, there are several ways to create NLPModels, so a solver writer can test on hand-written problems, JuMP-written problems, and CUTEst problems.
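A sketch of that pattern (solve_sketch and its fixed step size are my own illustration, not a real solver; the point is only the AbstractNLPModel argument, which works the same whether the model came from pure Julia functions, JuMP, or CUTEst):

```julia
using NLPModels, LinearAlgebra

function solve_sketch(nlp :: AbstractNLPModel; atol = 1e-6, max_iter = 100)
    x = copy(nlp.meta.x0)       # the API provides the starting point
    for _ in 1:max_iter
        gx = grad(nlp, x)       # same call regardless of the model backend
        norm(gx) < atol && break
        x -= 1e-3 * gx          # placeholder step; a real solver does more
    end
    return x
end
```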

Finally, we also have a function called NLPtoMPB that converts an NLPModel to an MPB model, so there is some interchangeability. In this example we solve a CUTEst problem using Ipopt.

From my reading and some discussions of how NLPModels works and differs from MathProgBase: in terms of implementation, MPB specifies an API that all solvers implement and that all user-facing modeling layers use to talk to solvers. NLPModels also defines a concrete data structure, whereas in MPB each solver is free to implement its own type and store relevant data however it likes (usually closer to whichever solver API it happens to be wrapping). Correct me if that’s a bad take on it.

Not sure about the MPB side, but I think so. Yes for NLPModels. I think it would be clearer if we had a native Julia solver using MPB. Here’s an example of a solver using NLPModels.

Whereas JuMP is a modeling language, NLPModels is a model API that has worked well for us in the past and that can be relied upon when writing solvers in pure Julia. Compared to MPB, NLPModels mostly disconnects solver and model. It has a somewhat richer API that allows methods such as objgrad() for problems where computing the objective value and gradient together is more efficient than obtaining each separately. You can also, e.g., add slack variables to any model transparently. That said, the code is still young, so all comments and bug reports are most welcome.
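For instance, a small sketch of the objgrad() call mentioned above (assuming NLPModels and ForwardDiff are installed; the model is the one from the announcement's example):

```julia
using NLPModels  # ADNLPModel also needs ForwardDiff

nlp = ADNLPModel(x -> (x[1] - 1)^2 + 100 * (x[2] - x[1]^2)^2, [-1.2; 1.0])

# One call instead of obj(nlp, x) followed by grad(nlp, x); this pays off
# when the objective and gradient share intermediate computations.
fx, gx = objgrad(nlp, nlp.meta.x0)
```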

As Abel mentioned, models may be loaded from an AMPL nl file, a CUTEst SIF file, a JuMP model, pure Julia functions, or from Julia functions that realize an interface with some other system. They can also be passed to an MPB-enabled solver.