Behaviour of sparse array variables in JuMP

Take as an example a Boolean vector v with n elements. Based on this vector we build a variable array x of the same length as v, in which only the indices corresponding to the non-zero elements of v are actually variables.

With JuMP we could do something like this:

v = Bool[1, 1, 1, 0, 0, 1]
n = length(v)

@variable(model, x[i = 1:n; v[i] != 0])

The problem for me is the interaction of this variable x with the rest of the model building. Say we wanted to create the following objective function:

c = [1, 2, 3, 4, 5, 6]
@objective(model, Min, c' * x)

This throws: ERROR: DimensionMismatch("first array has length 6 which does not match the length of the second, 4.")
I would have expected the variable to behave like a sparse array, with the "non-variable" entries of x being zeros. But x[4] simply doesn't exist, instead of being 0.

Is there any workaround to fill the empty entries with zeros? I would like to avoid for-loops, since the model I'm trying to build is already cleanly written in matrix-vector form (and is much more complex than this example).

Thank you :slightly_smiling_face:

Despite the name, SparseAxisArray isn't really a sparse matrix. It acts more like a dictionary. Why not just make x dense and then fix the unwanted entries to zero? Solvers will presolve them out.
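A minimal sketch of that suggestion, reusing the v and c from the question (the exact fix semantics may differ across JuMP versions, so treat this as a sketch rather than canonical usage):

```julia
using JuMP

v = Bool[1, 1, 1, 0, 0, 1]
c = [1, 2, 3, 4, 5, 6]
n = length(v)

model = Model()
@variable(model, x[1:n])          # dense: every index is a real variable
for i in 1:n
    if !v[i]
        fix(x[i], 0.0)            # pin the "non-variable" entries to zero
    end
end
@objective(model, Min, c' * x)    # dimensions now match, so this works
```

Because x is a plain dense vector of variables, all the matrix-vector expressions in the model go through unchanged, and the presolve removes the fixed entries.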


I have to use sparse arrays (in particular from the SparseArrays stdlib, but also some custom array types) with JuMP all the time. I'm pleased to report that this works great, but unfortunately JuMP is still missing a lot of methods for making it convenient; in particular, last I checked it was missing pretty much all efficient sparse matrix multiplications. The good news is that the default methods aren't always that bad, and it's pretty easy to add more.

I keep intending to make PRs to JuMP to make it more convenient to work with sparse variables. One of the things that has stopped me is that I'm not entirely clear on where in the JuMP API these would go.

Things that are needed include:

  • The aforementioned efficient multiplication methods
  • Perhaps some macros that simplify some expressions (e.g. there are some cases in which having a single multiplication with 3 arguments for a matrix multiplication is much more efficient than two separate multiplications).
  • Convenience functions for defining sparse variables (I often use one where I pass a binary matrix that specifies where the non-zero elements are).
  • Incorporating the sparse variable definitions into the existing macros.

Yeah, I just found out about that in the documentation: http://www.juliaopt.org/JuMP.jl/v0.20.0/containers/#JuMP.Containers.SparseAxisArray

I think I will do as you suggest. What would be your recommendation for fixing them? A simple constraint?

After the presolve, will the solvers simply ignore those variables, or is there some overhead to it?

That is exactly what I’m looking for. How are you currently implementing that? (I mean, your workaround)

See the discussion of JuMP.fix in https://www.juliaopt.org/JuMP.jl/stable/variables/#Variable-bounds-1.

Solvers will ignore variables forced to be 0.
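A hedged sketch of fix, per the linked docs (the force keyword is needed when the variable already has bounds, since fixing overrides them):

```julia
using JuMP

model = Model()
@variable(model, 0 <= y <= 10)
fix(y, 0.0; force = true)   # force = true because y has bounds;
                            # for an unbounded variable, fix(y, 0.0) suffices
is_fixed(y)                 # should report true
fix_value(y)                # should report 0.0
```

Fixing is preferable to an equality constraint here: the solver sees the variable as a constant and presolves it away rather than carrying an extra row.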

@ExpandingMan the issue is that operations on variables don't preserve the output type: Variable + Variable isn't a Variable. This contradicts assumptions spread throughout base Julia code, e.g., Float64 + Float64 == Float64.


You can easily do SparseArrays.findnz and then write a loop that defines scalar JuMP variables and sticks them in a matrix.

Yes, I should have mentioned that when you create the sparse variable matrices, you should create matrices of GenericAffExpr, not of VariableRef. This works really well for linear problems; I've yet to work it out for quadratic problems.
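A minimal sketch of that pattern, assuming a sparsity-pattern matrix S (the helper name sparse_variables is mine, not a JuMP API):

```julia
using JuMP, SparseArrays

# Build a sparse matrix of affine expressions with a fresh variable at each
# stored entry of the pattern S; all other entries are structural zeros.
function sparse_variables(model::Model, S::SparseMatrixCSC)
    I, J, _ = findnz(S)
    vars = @variable(model, [1:length(I)])
    # Promote VariableRef to AffExpr so that zero entries and sums
    # produced by sparse linear algebra are well-defined.
    return sparse(I, J, [1.0 * v for v in vars], size(S)...)
end

S = sparse([1, 2, 3], [1, 2, 3], trues(3), 3, 3)
model = Model()
X = sparse_variables(model, S)   # 3×3 sparse matrix, 3 stored AffExpr entries
```

With the entries stored as affine expressions, generic sparse matrix products like c' * X fall through to the default AbstractArray methods, which, as noted above, aren't always fast but do work.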

To be honest, the last time I really looked at the JuMP sparse dictionary stuff I found it super annoying, because it did not behave enough like a normal AbstractArray in the cases I needed. Granted, it's been a while since I did anything with these, so the situation may have improved.
