Integer Optimization: performance help

Hello,

I am working with an integer optimization problem and I am looking for ways to speed the code up. The math problem is the following:

I am sampling K different outcomes of the random variable \omega^{k} (msimul in the example below) and solving

\max_{I(\omega^{k})\forall k} \frac{1}{K} \sum_{k=1}^{K} F(I(\omega^{k}),\omega^{k})

subject to

I(\omega^{1})=I(\omega^{2})=I(\omega^{3})=\cdots=I(\omega^{K})

Find below an MWE:

using JuMP, BenchmarkTools, Gurobi, Random, Distributions

S = 100    # number of simulated scenarios (K in the math)
J = 77     # number of markets j
Ι = 150    # number of choices i; this is a capital Iota (\Iota), not a Latin I

Random.seed!(12)

σ = 3
ϕ = 1
κ = 0.95

mtemp = rand(Normal(0, 1), Ι, J, S)
msimul = exp.(mtemp)
β = ones(J)
F = rand(Ι) * 100000

function problem(msimul, σ, F, J, Ι, κ, ϕ, β)
    m = msimul .^ (1 - σ)
    K = size(msimul, 3)    # number of scenarios (avoids relying on the global S)

    model = Model(Gurobi.Optimizer)
    set_silent(model)

    @variable(model, Ipost[1:Ι, 1:J, 1:K], Bin)
    @variable(model, Itrue[1:Ι], Bin)

    # Objective Function
    println("Objective")
    @objective(model, Max,
        ϕ / K * sum(β[j] * (m[:, j, k]' * Ipost[:, j, k]) for j = 1:J, k = 1:K) - F' * Itrue)

    println("Constraints")
    # Constraint 1 and 2 - non-anticipativity constraint and ex-post choice
    for k = 1:K, j = 1:J
        @constraint(model, Ipost[:, j, k]' * Ipost[:, j, k] == 1)
        @constraint(model, sum(Ipost[:, j, k]' * Itrue) == 1)
    end

    println("Optimize")
    optimize!(model)

    Ipost = trunc.(Int, value.(Ipost))
    Iopt = value.(Itrue)

    return trunc.(Int, Iopt), objective_value(model)
end

solution = @btime problem(msimul, σ, F, J, Ι, κ, ϕ, β)

I am looking for ways to speed the code up

The JuMP construction or the problem solve? It looks like you’re solving the extensive formulation here, which grows large as you scale. You might want to look into something like progressive hedging to decompose this problem.
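(Roughly, progressive hedging replaces the single extensive model with K scenario subproblems plus an averaging step. A sketch of the iteration in the thread's notation, where \rho is a user-chosen penalty parameter, w^{k} are scenario multipliers, and \bar{I}^{t} is the scenario average, none of which appear above:

I^{k,t+1} \in \arg\max_{I} \; F(I,\omega^{k}) - (w^{k,t})^{\top} I - \frac{\rho}{2}\lVert I - \bar{I}^{t}\rVert^{2}

\bar{I}^{t+1} = \frac{1}{K}\sum_{k=1}^{K} I^{k,t+1}, \qquad w^{k,t+1} = w^{k,t} + \rho\,(I^{k,t+1} - \bar{I}^{t+1})

The subproblems are solved independently per scenario, and the multipliers push the scenario solutions toward agreement.)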

There are a few related packages:

Thanks so much! I am looking for ways to improve on both fronts. Basically, for each realization of \omega I am choosing the I(\omega^{k}) that maximizes the sum, and then I am making the choices the same across simulations k by using a non-anticipativity constraint.

Perhaps I am picking this maximum in an inefficient way?

Or perhaps there is a better way to enforce the constraints?

I will definitely take a look at the packages you are suggesting.

I don’t really understand what your nonanticipativity constraints are doing, or why they are quadratics.

You could consider doing this binary * binary to linear reformulation:
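For reference, the standard product-of-binaries linearization looks roughly like this (a generic sketch with placeholder names x, y, z, not variables from the model above):

using JuMP
model = Model()
@variable(model, x, Bin)
@variable(model, y, Bin)
@variable(model, z, Bin)            # z stands in for the product x * y
@constraint(model, z <= x)          # z can be 1 only if x is 1
@constraint(model, z <= y)          # z can be 1 only if y is 1
@constraint(model, z >= x + y - 1)  # z must be 1 when both x and y are 1

Each bilinear term x * y can then be replaced by its auxiliary z, so the quadratic constraints become linear.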

Thanks,

Basically, the non-anticipativity constraint tells the model to choose the same vector in every state of the world k. This is the second constraint.

The first constraint says that, for each state of the world k, I can choose exactly one i for each j.

I thought that doing these as dot products could save some time, hence the quadratics.
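(For a generic binary vector, though, the dot product with itself reduces to a plain sum,

x^{\top}x = \sum_{i} x_{i}^{2} = \sum_{i} x_{i} \quad \text{for } x_{i}\in\{0,1\},

so Ipost[:,j,k]'*Ipost[:,j,k] == 1 carries the same information as sum(Ipost[:,j,k]) == 1, but the solver sees a quadratic constraint instead of a linear one.)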

I thought that doing these as dot products could save some time

You need to reconsider your formulation. I don’t fully understand, but perhaps something like:

# For each j, k, choose 1 i
@constraint(model, [k=1:K, j=1:J], sum(Ipost[:, j, k]) == 1)
# Nonanticipativity
@constraint(model, [k=1:K, j=1:J], Ipost[:, j, k] .== Itrue)

But this still isn’t right. Because why do you have Ipost if it just has to equal Itrue? Why not just put Itrue everywhere?

Rather than try to improve your code, you need to step back and formulate the problem on paper first.

@odow, thanks so much. Your suggestion greatly improved performance (\approx 30x) with this small correction in the second constraint. I am not asking for the whole vector Ipost[:,j,k] to be equal to Itrue; the constraint is that the entry of Ipost[:,j,k] that is equal to 1 must be 1 in Itrue as well.

# For each j, k, choose 1 i
@constraint(model, [k=1:K, j=1:J], sum(Ipost[:, j, k]) == 1)
# Nonanticipativity
@constraint(model, [k=1:K, j=1:J], Itrue - Ipost[:, j, k] .>= 0)

Now I am forcing the entry equal to one in Ipost[:,j,k] to be among the entries equal to one in Itrue. However, if you can think of a better way to write the constraint, I would appreciate it.
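For what it's worth, an equivalent elementwise way to write the same linking constraint, assuming the model from the MWE, is

@constraint(model, [k=1:K, j=1:J], Ipost[:, j, k] .<= Itrue)

Both forms produce the same Ι·J·K linear inequalities, so the solver should treat them identically.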

The reason I cannot plug in Itrue instead of Ipost is that I am allowing Itrue to have more than one entry equal to 1. However, for each j, only one entry of Itrue may enter the sum in the objective function.

One way to think about the problem is that you are a firm that wants to serve many markets, but to do that, the firm needs to set up distribution centers. The firm will serve each market only from the distribution center that maximizes the objective function. However, which distribution center is chosen depends on the realization of \omega^{k}.
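Written out on paper, the extensive form being solved after the correction above is roughly the following (a sketch in the MWE's notation, with x_{ijk} playing the role of Ipost, y_{i} of Itrue, and m_{ijk} = \text{msimul}_{ijk}^{1-\sigma}):

\max_{x,\,y\ \text{binary}} \; \frac{\phi}{K}\sum_{k=1}^{K}\sum_{j=1}^{J}\beta_{j}\sum_{i=1}^{Ι} m_{ijk}\,x_{ijk} \;-\; \sum_{i=1}^{Ι} F_{i}\,y_{i}

subject to

\sum_{i=1}^{Ι} x_{ijk} = 1 \quad \forall j,k, \qquad x_{ijk} \le y_{i} \quad \forall i,j,k,

i.e. a two-stage facility-location-style problem: y picks the distribution centers up front, and each x_{\cdot jk} assigns market j to exactly one open center in scenario k.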
