JuMP.jl vs MATLAB's Optimization Toolbox

I am an undergrad, and I am considering taking an optional course on Optimization Engineering next semester. In my engineering faculty, MATLAB is the default programming language, and I have heard that the subject relies heavily on MATLAB. I therefore assume that it must be the Optimization Toolbox that will be used a lot. On the MathWorks page for the Optimization Toolbox, the list of features includes:

The toolbox includes solvers for linear programming (LP), mixed-integer linear programming (MILP), quadratic programming (QP), second-order cone programming (SOCP), nonlinear programming (NLP), constrained linear least squares, nonlinear least squares, and nonlinear equations.

The Julia alternative seems to be JuMP.jl, with a very similar feature list:

JuMP makes it easy to formulate and solve linear programming, semidefinite programming, integer programming, convex optimization, constrained nonlinear optimization, and related classes of optimization problems.

The lists seem very similar, but they are clearly not identical. If you have experience with both the Julia package JuMP.jl AND the Optimization Toolbox in MATLAB, could you contrast them? Do you think that following the course using Julia will work fine?

For context, I have followed MATLAB-based courses through the Julia-equivalent packages in the domains of Control Systems and Digital Signal Processing, and found it to be no real obstacle.

You will not go wrong with Julia (JuMP.jl and Optim.jl). This is where I spend most of my time now, after using MATLAB and GAMS.

From what I recall, TOMLAB in MATLAB comes closest to JuMP.jl in that it tries to bring algebraic formulation to MATLAB. But I found GAMS (and now JuMP) easier to use for algebraic formulations.
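To make "algebraic formulation" concrete, here is a minimal JuMP sketch of the classic GAMS transportation problem, written in the indexed, set-based style GAMS users will recognize. The data values and the choice of the open-source HiGHS solver are just for illustration:

```julia
using JuMP, HiGHS  # HiGHS is one open-source LP/MILP solver JuMP can drive

plants  = ["seattle", "sandiego"]
markets = ["newyork", "chicago"]
capacity = Dict("seattle" => 350.0, "sandiego" => 600.0)
demand   = Dict("newyork" => 325.0, "chicago" => 300.0)
cost = Dict(("seattle", "newyork") => 2.5, ("seattle", "chicago") => 1.7,
            ("sandiego", "newyork") => 2.5, ("sandiego", "chicago") => 1.8)

model = Model(HiGHS.Optimizer)
@variable(model, ship[plants, markets] >= 0)          # amount shipped on each route
@constraint(model, [p in plants],
    sum(ship[p, m] for m in markets) <= capacity[p])  # supply limits
@constraint(model, [m in markets],
    sum(ship[p, m] for p in plants) >= demand[m])     # demand requirements
@objective(model, Min,
    sum(cost[(p, m)] * ship[p, m] for p in plants, m in markets))
optimize!(model)
println(objective_value(model))
```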

If your problems in MATLAB rely on fmincon/fminsearch, then look at Optim.jl; that would be the closest in problem-formulation style. In fact, you can copy-paste code, change solvers, adjust the indexing and the notation for the parameters to search over, and the code will work. But learn to code the right way in Julia first.
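As a sketch of how close the styles are, here is a minimal Optim.jl call on the Rosenbrock function, roughly the shape of an fminsearch call (the function and starting point are just the standard textbook example):

```julia
using Optim

# Rosenbrock function: the classic unconstrained test problem
rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

x0 = [0.0, 0.0]                                  # starting point
result = optimize(rosenbrock, x0, NelderMead())  # derivative-free, like fminsearch
println(Optim.minimizer(result))                 # ≈ [1.0, 1.0]
```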


I also think that you can hardly miss anything optimization-related in Julia. I would just recommend adding Convex.jl to the (already recommended) JuMP.jl and Optim.jl packages. Note that Convex.jl has its MATLAB counterpart too: the CVX toolbox.
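For a flavor of the CVX-like disciplined convex programming style, here is a small Convex.jl sketch of a least-squares problem with a norm constraint. The random data and the choice of the open-source SCS solver are just for illustration:

```julia
using Convex, SCS, LinearAlgebra  # SCS is one open-source conic solver

A = randn(20, 5)   # made-up data for illustration
b = randn(20)
x = Variable(5)

# Stated declaratively, as in CVX: minimize an objective subject to constraints
problem = minimize(sumsquares(A * x - b), [norm(x, 1) <= 2.0])
solve!(problem, SCS.Optimizer)
println(evaluate(x))
```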

By the way, algorithmic (aka automatic) differentiation has only been added to the Optimization Toolbox for MATLAB recently (in the R2020b release). Funny, because I can turn your question the other way around: now you can do in MATLAB what has been possible in Julia for quite some time 🙂
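In a similar spirit, JuMP differentiates nonlinear models automatically, so you rarely write derivatives by hand. As a minimal standalone sketch, ForwardDiff.jl computes exact gradients of plain Julia functions (the function below is an arbitrary example):

```julia
using ForwardDiff

# Any plain Julia function of a vector works; no hand-coded derivatives needed
f(x) = sin(x[1]) * exp(x[2]) + x[1]^2

g = ForwardDiff.gradient(f, [1.0, 2.0])  # exact gradient via forward-mode AD
println(g)
```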


Don’t worry. You can do it with JuMP and MOI (MathOptInterface). I am porting my work from MATLAB to Julia.

Just do it with Julia, and you will find more interesting things than in MATLAB.
