HiGHS v1.9 supports multiple objectives, either by blending them with weights or by performing lexicographic optimization. I couldn’t find how to use this from JuMP. Is it possible yet?
Hi @qwedsad, welcome to the forum!
I would recommend that you use MultiObjectiveAlgorithms.jl instead. It is more flexible than the algorithm built into HiGHS and it supports more ways of solving the problem.
See the JuMP documentation: Simple multi-objective examples · JuMP
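As a minimal sketch of the pattern from that tutorial (the same setup appears in full further down this thread), you wrap the single-objective solver in MOA.Optimizer, pick an algorithm via the MOA.Algorithm() attribute, and pass a vector-valued objective:

using JuMP, HiGHS
import MultiObjectiveAlgorithms as MOA

# Wrap HiGHS in MOA's meta-solver.
model = Model(() -> MOA.Optimizer(HiGHS.Optimizer))

# Choose how the objectives are treated, e.g. lexicographic ordering.
# Other algorithms can be selected through the same attribute.
set_attribute(model, MOA.Algorithm(), MOA.Lexicographic())

@variable(model, x >= 0)
@variable(model, 0 <= y <= 3)

# A vector-valued objective defines the objectives to minimize.
@objective(model, Min, [3x + y, -x - 2y])
@constraint(model, 3x - y <= 6)

optimize!(model)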
Hi,
Thank you for the insightful answer!
I have now implemented my problem with MOA. I only need lexicographic optimization for two objectives. However, if I understood correctly, MOA first solves the problem w.r.t. each objective separately to obtain the ideal bounds, and only then runs the lexicographic optimization. In my case this behaviour practically doubles the running time. Is there a way to skip the “ideal bound search” completely?
Another question: is it possible to pass the solution obtained for the first objective as an initial solution when solving the problem w.r.t. the second objective? I believe this would avoid some needless search in the second phase. Should I just code the whole procedure myself if I want this functionality?
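For reference, the manual procedure I have in mind would be roughly the following sketch (the slack tolerance and the warm start via set_start_value are just illustrative; I haven't benchmarked whether the solver actually exploits the start values):

using JuMP, HiGHS

model = Model(HiGHS.Optimizer)
@variable(model, x >= 0)
@variable(model, 0 <= y <= 3)
@constraint(model, 3x - y <= 6)

# Phase 1: optimize the first objective on its own.
@objective(model, Min, 3x + y)
optimize!(model)
z1 = objective_value(model)
x_opt, y_opt = value(x), value(y)

# Phase 2: bound the first objective (with a small slack), warm-start from
# the phase-1 solution, and re-optimize for the second objective.
@constraint(model, 3x + y <= z1 + 1e-6)
set_start_value(x, x_opt)
set_start_value(y, y_opt)
@objective(model, Min, -x - 2y)
optimize!(model)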
Is there a way to skip the “ideal bound search” completely?
No, but this is a good feature request. I’ll open an issue: Add option to skip ideal point calculation · Issue #95 · jump-dev/MultiObjectiveAlgorithms.jl · GitHub
I simply need lexicographical optimization for two objectives
If you care only about the lexicographic solution of the objectives in the order they are given, then do:
set_attribute(model, MOA.LexicographicAllPermutations(), false)
Is it possible to give the solution obtained by solving the problem w.r.t. first objective as an initial solution when solving the problem
We already do this. It’s up to HiGHS to determine if it can efficiently restart with the provided solution.
Just to follow up: the upcoming release of MOA v1.4.0 (New version: MultiObjectiveAlgorithms v1.4.0 by JuliaRegistrator · Pull Request #127085 · JuliaRegistries/General · GitHub) adds support for set_attribute(model, MOA.ComputeIdealPoint(), false):
julia> using JuMP, HiGHS
julia> import MultiObjectiveAlgorithms as MOA
julia> model = Model(() -> MOA.Optimizer(HiGHS.Optimizer))
A JuMP Model
├ solver: MOA[algorithm=MultiObjectiveAlgorithms.Lexicographic, optimizer=HiGHS]
├ objective_sense: FEASIBILITY_SENSE
├ num_variables: 0
├ num_constraints: 0
└ Names registered in the model: none
julia> set_attribute(model, MOA.Algorithm(), MOA.Lexicographic())
julia> set_attribute(model, MOA.LexicographicAllPermutations(), false)
julia> set_attribute(model, MOA.ComputeIdealPoint(), false)
julia> @variable(model, x1 >= 0)
x1
julia> @variable(model, 0 <= x2 <= 3)
x2
julia> @objective(model, Min, [3x1 + x2, -x1 - 2x2])
2-element Vector{AffExpr}:
3 x1 + x2
-x1 - 2 x2
julia> @constraint(model, 3x1 - x2 <= 6)
3 x1 - x2 ≤ 6
julia> optimize!(model)
Running HiGHS 1.9.0 (git hash: 66f735e60): Copyright (c) 2024 HiGHS under MIT licence terms
Coefficient ranges:
Matrix [1e+00, 3e+00]
Cost [1e+00, 3e+00]
Bound [3e+00, 3e+00]
RHS [6e+00, 6e+00]
Presolving model
0 rows, 0 cols, 0 nonzeros 0s
0 rows, 0 cols, 0 nonzeros 0s
Presolve : Reductions: rows 0(-1); columns 0(-2); elements 0(-2) - Reduced to empty
Solving the original LP from the solution after postsolve
Model status : Optimal
Objective value : 0.0000000000e+00
Relative P-D gap : 0.0000000000e+00
HiGHS run time : 0.00
Coefficient ranges:
Matrix [1e+00, 3e+00]
Cost [1e+00, 2e+00]
Bound [3e+00, 3e+00]
RHS [6e+00, 6e+00]
Solving LP without presolve, or with basis, or unconstrained
Using EKK dual simplex solver - serial
Iteration Objective Infeasibilities num(sum)
0 -9.9999663031e-01 Ph1: 2(6); Du: 1(0.999997) 0s
2 0.0000000000e+00 Pr: 0(0) 0s
Model status : Optimal
Simplex iterations: 2
Objective value : 0.0000000000e+00
Relative P-D gap : 0.0000000000e+00
HiGHS run time : 0.00
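After optimize!, the usual JuMP accessors should apply; with a single lexicographic permutation there is one result, so something like this works:

value(x1)               # optimal value of x1
value(x2)               # optimal value of x2
objective_value(model)  # vector with one entry per objective
solution_summary(model) # overview of termination status and results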