I’m working on my master’s thesis on Multi-Objective Optimization (MOO), and I have found JuMP to be a clean and elegant framework. However, JuMP does not currently support multi-objective optimization. Are you planning to support it in the future?
Of course, I know I could instead establish a preference articulation over my objectives and formulate my problem as a single-objective optimization problem (I sketch what I mean below, after my MultiJuMP example). However, I am more interested in giving users a clear picture of the trade-offs involved in the optimization problem than in providing a single solution. I’m aware of MultiJuMP, but it does not satisfy my needs, as I have not been able to solve the DTLZ1 problem with it.
Additionally, the lack of documentation for MultiJuMP makes it hard to tell whether I am doing something wrong.
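For reference, the two-objective DTLZ1 instance I am targeting is (as I understand it, following Deb et al.):

$$
\begin{aligned}
\min_x \quad & f_1(x) = \tfrac{1}{2}\, x_1 \bigl(1 + g(x_M)\bigr), \qquad
               f_2(x) = \tfrac{1}{2}\,(1 - x_1)\bigl(1 + g(x_M)\bigr),\\
\text{with}\quad & g(x_M) = 100\Bigl(|x_M| + \sum_{x_i \in x_M}\bigl[(x_i - 0.5)^2 - \cos\bigl(20\pi(x_i - 0.5)\bigr)\bigr]\Bigr),
               \qquad 0 \le x_i \le 1,
\end{aligned}
$$

where $x_M$ holds the remaining “distance” variables. My attempt below uses a simplified variant of $g$: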
using JuMP, MultiJuMP
using NLopt

# Multi-objective model; NLopt's CRS2 (with local mutation) does the actual solving.
m = MultiModel(solver = NLoptSolver(algorithm = :GN_CRS2_LM, maxeval = 100))

# Multimodal term shared by both objectives (pi instead of the 3.14 approximation).
f(z...) = 2 + sum([(z[i] - 0.5)^2 - cos(20 * pi * (z[i] - 0.5)) for i in 1:length(z)])
# Register f so it can be used inside the nonlinear expressions below.
JuMP.register(m, :f, 2, f; autodiff = true)

@variable(m, 0 <= x <= 1)
@variable(m, 0 <= y <= 1)

# The two conflicting objectives.
@NLexpression(m, f2, 0.5 * (1 - x) * (1 + f(x, y)))
@NLexpression(m, f1, 0.5 * x * (1 + f(x, y)))

# Hand both objectives to MultiJuMP: 10 points per dimension on the front,
# computed with the Normal Boundary Intersection (NBI) method.
obj1 = SingleObjective(f1, sense = :Min)
obj2 = SingleObjective(f2, sense = :Min)
md = getMultiData(m)
md.objectives = [obj1, obj2]
md.pointsperdim = 10
solve(m, method = :NBI)
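For contrast, the scalarized, preference-articulated formulation I mentioned above would look roughly like the sketch below. It is only a sketch under my own assumptions (the weight grid, the maxeval budget, and the reuse of NLopt’s CRS2 are arbitrary choices): each weight w gives one single-objective solve, and I collect the resulting objective pairs.

```julia
using JuMP, NLopt

# Weighted-sum scalarization: one single-objective solve per weight w,
# collecting the resulting (f1, f2) pairs as a crude front approximation.
front = Tuple{Float64,Float64}[]
for w in 0.0:0.1:1.0
    sm = Model(solver = NLoptSolver(algorithm = :GN_CRS2_LM, maxeval = 5000))
    @variable(sm, 0 <= x <= 1)
    @variable(sm, 0 <= y <= 1)
    @NLexpression(sm, g, 2 + (x - 0.5)^2 - cos(20 * pi * (x - 0.5)) +
                             (y - 0.5)^2 - cos(20 * pi * (y - 0.5)))
    @NLexpression(sm, f1, 0.5 * x * (1 + g))
    @NLexpression(sm, f2, 0.5 * (1 - x) * (1 + g))
    @NLobjective(sm, Min, w * f1 + (1 - w) * f2)
    solve(sm)
    # Recompute the objective values at the solution found for this weight.
    xv, yv = getvalue(x), getvalue(y)
    gv = 2 + (xv - 0.5)^2 - cos(20 * pi * (xv - 0.5)) +
             (yv - 0.5)^2 - cos(20 * pi * (yv - 0.5))
    push!(front, (0.5 * xv * (1 + gv), 0.5 * (1 - xv) * (1 + gv)))
end
```

Besides pushing the burden of choosing weights onto the user, a plain weighted sum can miss non-convex regions of the Pareto front, which is exactly the trade-off information I want to expose.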
Also, are you planning on supporting derivative-free optimization? NLopt supports derivative-free algorithms; however, JuMP currently claims that such optimization is not supported. And although I’m aware of BlackBoxOptim, it only provides a single multi-objective optimization algorithm…
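For completeness, this is roughly how I can attack the same bi-objective problem with BlackBoxOptim today. The sketch is adapted from its README (Borg MOEA is, as far as I can tell, the single multi-objective algorithm I referred to), and the ϵ and MaxSteps values are arbitrary choices of mine.

```julia
using BlackBoxOptim

# Bi-objective fitness mirroring the expressions above: returns (f1, f2).
function dtlz1_like(z)
    g = 2 + sum((zi - 0.5)^2 - cos(20 * pi * (zi - 0.5)) for zi in z)
    return (0.5 * z[1] * (1 + g), 0.5 * (1 - z[1]) * (1 + g))
end

res = bboptimize(dtlz1_like;
                 Method = :borg_moea,
                 FitnessScheme = ParetoFitnessScheme{2}(is_minimizing = true),
                 SearchRange = (0.0, 1.0),
                 NumDimensions = 2,
                 ϵ = 0.05,
                 MaxSteps = 50000)
```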