I have a model with named variables as DenseAxisArray, e.g.,
using JuMP, Gurobi
model = direct_model(Gurobi.Optimizer())
@variable(model, 0 <= x[1:10, 1:5, 1:2] <= 1)  # the macro registers model[:x] automatically
This creates a DenseAxisArray which I can access using the container methods in JuMP.
In a later function, I would like to add more x variables for a specific index, extending the variable along (at least one) dimension:
new_indices = [(11, 5, 2), (12, 5, 2)]
for (a, b, c) in new_indices
    name = "x[$a,$b,$c]"
    var = @variable(model, lower_bound = 0, upper_bound = 1, base_name = name)
end
Is there a way to extend or re-size the original @container? I would ideally like to be able to query model[:x][11,5,2] to receive the VariableRef to the new variable, such that I can use this in building further constraints.
Note: in my case, the extension would always be for a consecutive list of integers for each dimension index.
Are you doing column generation? Maybe try asking JuMP for anonymous new variables and organizing them in Julia Arrays?
PS I assume that a column generation algorithm is more involved, as it not only needs to modify the decision variables but also the constraints. IIRC, I used to build the model and solve it inside a function, such that I rebuilt a new model (and solved it) at each iteration. In contrast, a pure cutting plane algorithm is much simpler to code.
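A minimal sketch of the anonymous-variable idea (using a plain Model() here just for illustration):

```julia
using JuMP

model = Model()

# Anonymous variables: @variable without an explicit name registers
# nothing in the model, so we manage the references ourselves.
x = [@variable(model, lower_bound = 0, upper_bound = 1)
     for a in 1:10, b in 1:5, c in 1:2]

# Extending along the first dimension is then ordinary array
# concatenation: build the new slice and cat it on.
new_slice = [@variable(model, lower_bound = 0, upper_bound = 1)
             for a in 1:2, b in 1:5, c in 1:2]
x = cat(x, new_slice; dims = 1)  # x is now a 12×5×2 Array of VariableRef
```

This only works cleanly when the extension keeps the array rectangular, which matches the "consecutive list of integers" note above.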
In most cases, there is no easy way to extend containers like this.
For example, your current x is a rectangular three-dimensional Base.Array. If you add two new variables to it, we cannot keep the same type: what would x[11, 3, 1] be?
If you’re struggling with JuMP’s containers, it usually means that you have chosen the wrong container type, and a different data structure would suit you better. Choosing the better data structure is, unfortunately, problem specific; it depends on what you are trying to do.
So: what’s your larger goal? What problem are you trying to solve?
Thanks for your thoughts and ideas!
I am indeed working on a column generation algorithm (as part of a Branch-and-Price scheme). My current approach is to build a sort of ‘base model’ at the root node, extend this with the required variables and constraints in each node, solve it, and run a ‘clean up’ to remove the added variables and constraints. I was hoping to be able to avoid re-building the model from scratch at every step, though I have not done computational experiments to see how expensive this would be.
@WalterMadelim yes, you’re absolutely right, my variables in the MWE are all Arrays. Due to some other index sets (e.g., only even numbers in a range) I end up with DenseAxisArrays sometimes.
@odow Good point. Re-examining my structure, I would actually be extending the array along one dimension (here, the A-dimension): whenever a new column is added to A, I need to generate the corresponding variables for every b in B and c in C. Then, my code would be
new_A = [11, 12]
for a in new_A, b in B, c in C
    name = "x[$a,$b,$c]"
    var = @variable(model, lower_bound = 0, upper_bound = 1, base_name = name)
end
Though, as you point out, I am not yet convinced that I have chosen the best container type and data structure for my problem.
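For the record, one workaround I am considering is a Dict keyed by index tuples, which would give me the model[:x][(11, 5, 2)]-style lookup I described (untested sketch):

```julia
using JuMP

model = Model()
B, C = 1:5, 1:2

# A Dict keyed by the index tuple is not rectangular, so it can
# grow along any dimension without re-sizing a container.
x = Dict(
    (a, b, c) => @variable(model, lower_bound = 0, upper_bound = 1,
                           base_name = "x[$a,$b,$c]")
    for a in 1:10, b in B, c in C
)

# Later, extend along A by adding new keys:
for a in [11, 12], b in B, c in C
    x[(a, b, c)] = @variable(model, lower_bound = 0, upper_bound = 1,
                             base_name = "x[$a,$b,$c]")
end

model[:x] = x       # optional: store the Dict in the model's object dictionary
x[(11, 5, 2)]       # VariableRef, usable in further constraints
```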
I know almost nothing about column generation, so take this with a grain of salt, but in terms of just the variables: when you have x[a,b,c], what conceptually is going on is that you have some sets A, B, C, and the decision variables x belong to a relation R\subseteq A\times B\times C (that is, each element of R has some x associated with it).

To me this suggests thinking more in terms of relational databases, of which DataFrames would be the simplest instance (you can only specify one table at a time, and relations between tables are not formalized). So maybe you could have a data frame with the signature DataFrame[:, [:a, :b, :c, :x]] and add or remove rows as elements are added to or removed from R?
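A rough sketch of that idea, assuming the DataFrames package (untested):

```julia
using DataFrames, JuMP

model = Model()

# One row per element of the relation R ⊆ A × B × C.
df = DataFrame(a = Int[], b = Int[], c = Int[], x = VariableRef[])
for a in 1:10, b in 1:5, c in 1:2
    push!(df, (a, b, c, @variable(model, lower_bound = 0, upper_bound = 1)))
end

# Adding elements to R is just appending rows:
for a in [11, 12], b in 1:5, c in 1:2
    push!(df, (a, b, c, @variable(model, lower_bound = 0, upper_bound = 1)))
end

# Lookup by index is a filter over rows:
sub = filter(r -> (r.a, r.b, r.c) == (11, 5, 2), df)
sub.x[1]  # the VariableRef
```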