How to model a multiobjective program

In the following code (from which I omit a long series of complicated constraints), I would like the objective function to specify, together with Min, sum(x[2:nr,:,"G/\\N"]), also Max, sum(x[1,:,"G/\\N"]).
Is it possible?
How can it be done?

Could something like this work?
Min, sum(x[2:nr,:,"G/\\N"])+1/sum(x[1,:,"G/\\N"])

But I would like to know, in general, how to set up an objective function that imposes many conditions on different groups of variables.

m, M = getdata(path,H[1],H[2],H[3]) # m Matrix, M Dict

using JuMP
import HiGHS

model = Model(HiGHS.Optimizer)
# set_silent(model)
# nr rows and ng columns (days) are assumed to be defined from the data
@variable(model, x[i in 1:nr, j in 1:ng, k in ["G", "N", "G/\\N", "G\\/N"]], Bin)
@objective(model, Min, sum(x[2:nr,:,"G/\\N"]))
@constraints(model, begin
    # total of G and N for row
    opcom[k in ["G", "N"], r in 1:nr], sum(x[r,:,k]) == M[k][r]
    # each day must have exactly one G (k==1) and one N (k==2)
    daycom[k in ["G", "N"], g in 1:ng], sum(x[:,g,k]) == 1
...
...
end)


function init_model(m)
    # Fix x[i,j,k] to b and strip the corresponding letter from the cell.
    fixm(i, j, k, b) = begin fix(x[i, j, k], b; force = true); m[i, j] = replace(m[i, j], r"[NG]" => "") end
    for i in 1:nr, j in 1:ng
        m[i, j] == "xx"  ? fixm.(i, j, ["G", "N"], 0)       :
        m[i, j] == "xg"  ? fixm(i, j, "G", 0)               :
        m[i, j] == "xn"  ? fixm(i, j, "N", 0)               :
        m[i,j]∈["G","N"] ? fixm(i, j, m[i,j], 1)            :    
        m[i, j] == "xnG" ? fixm.(i, j, ["N","G"], [0,1])    :
        m[i, j] == "xgN" ? fixm.(i, j, ["G","N"], [0,1])    :
        m[i, j] == "GN"  ? fixm(i, j, "G/\\N", 1)           :
        nothing
    end
end

init_model(m)

optimize!(model)

To clarify, do you want to minimize sum(x[2:nr,:,"G/\\N"]) and maximize sum(x[1,:,"G/\\N"])?

The first step is to convert everything to a single sense (all minimization or all maximization), which you can do by multiplying maximization objectives by -1: maximizing sum(x[1,:,"G/\\N"]) is the same as minimizing -sum(x[1,:,"G/\\N"]).

Then you need to decide on a trade-off: do you want to optimize Min, sum(x[2:nr,:,"G/\\N"]) - sum(x[1,:,"G/\\N"]), or do you want to solve a true bi-objective problem?
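To make the blended (single-objective) option concrete, here is a minimal, self-contained sketch on toy data — the 3×2 binary variable and the single constraint are invented for illustration, not taken from the original model. The point is only that maximizing a sum is the same as minimizing its negation, so both goals can be combined into one Min objective:

```julia
using JuMP
import HiGHS

model = Model(HiGHS.Optimizer)
set_silent(model)
# Toy stand-in for the real model: 3 rows, 2 columns, binary choices.
@variable(model, x[1:3, 1:2], Bin)
# Each column must select exactly two of the three rows.
@constraint(model, [j in 1:2], sum(x[:, j]) == 2)
# Blend the two goals: minimize the count in rows 2:3 and maximize the
# count in row 1 (written as minimizing its negation).
@objective(model, Min, sum(x[2:3, :]) - sum(x[1, :]))
optimize!(model)
```

Here every optimal solution picks row 1 in each column, so the blended objective value is 0. The drawback of blending is that you fix the trade-off between the two goals (here 1:1) before solving.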

See


Great question.
I hadn't thought the situation through enough. (It's a real resource-allocation problem, subject to a myriad of constraints. The requirements were posed to me by my wife in colloquial language, and I try to interpret them as best I can.)
I have to think about it carefully, but I believe the multi-objective approach is the most suitable one.
You could therefore change the title of the discussion to make it more relevant.
May I ask you, if you can, to explain in a (possibly) simple way the logic of the algorithm?
That is, if you want to minimize two functions simultaneously, how do you decide, for example, when to "sacrifice" a bit of the optimum of one to "improve" the other?


Follow the tutorials. The MOA package implements a number of algorithms that can compute the trade-off and return a set of candidate solutions. Picking the "best" one is up to you; there is no right answer.
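For what it's worth, here is a minimal sketch of the bi-objective route with MultiObjectiveAlgorithms.jl (MOA); the tiny model and its two objectives are invented for illustration. With a vector-valued @objective, the MOA.Optimizer wrapper returns a set of efficient (Pareto-optimal) solutions instead of a single one:

```julia
using JuMP
import HiGHS
import MultiObjectiveAlgorithms as MOA

# Wrap the single-objective solver in the multi-objective optimizer.
model = Model(() -> MOA.Optimizer(HiGHS.Optimizer))
set_silent(model)
set_attribute(model, MOA.Algorithm(), MOA.EpsilonConstraint())
@variable(model, x[1:2], Bin)
# Two conflicting goals, both written as Min (the second is a negated Max):
# a "cost" to minimize and a "benefit" to maximize.
@objective(model, Min, [x[1] + x[2], -(2x[1] + x[2])])
optimize!(model)
# Each result is one point on the efficient frontier.
for i in 1:result_count(model)
    println(objective_value(model; result = i), " at x = ", value.(x; result = i))
end
```

No single result dominates another: lowering the "cost" objective necessarily worsens the negated "benefit" objective, and it is up to you to pick among them.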


Thanks again for the prompt and comprehensive answers!

I did some tests and found solutions preferable to those obtained with the single-objective algorithm.
I would like to clear up one curiosity (among the many that arise when reading the examples in the tutorial).
It concerns the algorithms that can be passed to the set_attribute() function.
For example, the following:

set_attribute(model, MOA.Algorithm(), MOA.EpsilonConstraint())

set_attribute(model, MOA.Algorithm(), MOA.Lexicographic())

What are their specifics? When is one better than another?
Where can I read explanations of them?

When is one better than another?

There is no “right” answer, it depends on what you want.

Where can I possibly read explanations?

The docstring of each algorithm has some more details, although not much:

We could expand.

The main reason I didn’t write more is that if you Google ‘lexicographic multiobjective’ and ‘epsilon-constraint method multiobjective’ you’ll find quite a large literature on these methods.
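For a quick intuition (my summary, not the package docs): Lexicographic assumes you can rank the objectives by priority, optimizing the first, then the second subject to preserving the first, and so on, so it returns only a few solutions; EpsilonConstraint keeps one objective and turns the other into a constraint with a bound that it sweeps, tracing out the whole frontier. A sketch comparing the two on the same toy bi-objective model (invented for illustration):

```julia
using JuMP
import HiGHS
import MultiObjectiveAlgorithms as MOA

# Solve the same toy bi-objective model under a given MOA algorithm and
# return the list of objective vectors found.
function solve_with(algorithm)
    model = Model(() -> MOA.Optimizer(HiGHS.Optimizer))
    set_silent(model)
    set_attribute(model, MOA.Algorithm(), algorithm)
    @variable(model, x[1:2], Bin)
    # A "cost" to minimize and a negated "benefit" to maximize.
    @objective(model, Min, [x[1] + x[2], -(2x[1] + x[2])])
    optimize!(model)
    return [objective_value(model; result = i) for i in 1:result_count(model)]
end

# EpsilonConstraint sweeps a bound on one objective to enumerate the frontier.
frontier = solve_with(MOA.EpsilonConstraint())
# Lexicographic optimizes the objectives in priority order instead.
ranked = solve_with(MOA.Lexicographic())
println("epsilon-constraint found ", length(frontier), " solutions")
println("lexicographic found ", length(ranked), " solutions")
```

Roughly: use Lexicographic when you have a clear priority order and just want one (or a few) answers quickly; use EpsilonConstraint when you want to see the whole frontier before choosing, at the cost of many more solves.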