[JuMP] Container Functionality Feature Requests

I have been working with JuMP’s container types while developing my JuMP extension, and I have two observations that I think would make for two relatively simple additions.

Observation 1: According to the extension of Base.keys(::IndexAnyCartesian, d::SparseAxisArray) here (JuMP.jl/SparseAxisArray.jl at master · jump-dev/JuMP.jl · GitHub), Base.keys(d) should be defined, but it is not:

using JuMP; const JuMPC = JuMP.Containers
JuMPC.@container(a[i = 1:2, j = 1:2; i + j <= 3], 2)
keys(a)
ERROR: MethodError: no method matching size(::JuMP.Containers.SparseAxisArray{Int64,2,Tuple{Int64,Int64}})
Closest candidates are:
  size(::AbstractArray{T,N}, ::Any) where {T, N} at abstractarray.jl:38
  size(::BitArray{1}) at bitarray.jl:77
  size(::BitArray{1}, ::Integer) at bitarray.jl:81
  ...
Stacktrace:
 [1] axes at .\abstractarray.jl:75 [inlined]
 [2] keys(::JuMP.Containers.SparseAxisArray{Int64,2,Tuple{Int64,Int64}}) at .\abstractarray.jl:101
 [3] top-level scope at REPL[24]:1

However, could it not just be defined as Base.keys(d::SparseAxisArray) = keys(d.data)? This would enable Base.keys as a uniform method for accessing the keys of Arrays, DenseAxisArrays, and SparseAxisArrays, which is convenient when writing methods that need to iterate over the keys of a container.
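To illustrate the generic pattern a uniform keys method would enable, here is a sketch using only Base containers as stand-ins (a plain Array with CartesianIndex keys and a Dict with tuple keys, since the SparseAxisArray method is the proposal itself):

```julia
# Sketch of the generic `keys`-based iteration pattern. A plain Array and a
# Dict stand in here for the JuMP container types; `total` is a hypothetical
# helper, not a JuMP function.
function total(container)
    s = 0.0
    for k in keys(container)   # works for anything that defines `keys`
        s += container[k]
    end
    return s
end

total(ones(2, 2))                          # CartesianIndex keys -> 4.0
total(Dict((1, 1) => 2.0, (1, 2) => 3.0))  # tuple keys -> 5.0
```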

Observation 2: I noticed that DenseAxisArrays can be iterated over in a for loop (as enabled by Base.iterate), but comprehensions do not work for DenseAxisArrays:

using JuMP; const JuMPC = JuMP.Containers
JuMPC.@container(b[i = 2:3, j = 1:2], 2)
[i for i in b]
ERROR: MethodError: no method matching similar(::Type{Array{Int64,2}}, ::Tuple{UnitRange{Int64},Base.OneTo{Int64}})
Closest candidates are:
  similar(::AbstractArray{T,N} where N, ::Tuple) where T at abstractarray.jl:627
  similar(::Type{T<:AbstractArray}, ::Union{Integer, AbstractUnitRange}...) where T<:AbstractArray at abstractarray.jl:670
  similar(::Type{T<:AbstractArray}, ::Tuple{Vararg{Int64,N}} where N) where T<:AbstractArray at abstractarray.jl:672
  ...
Stacktrace:
 [1] _array_for(::Type{Int64}, ::JuMP.Containers.DenseAxisArray{Int64,2,Tuple{UnitRange{Int64},Base.OneTo{Int64}},Tuple{Dict{Int64,Int64},Dict{Int64,Int64}}}, ::Base.HasShape{2}) at .\array.jl:598
 [2] collect(::Base.Generator{JuMP.Containers.DenseAxisArray{Int64,2,Tuple{UnitRange{Int64},Base.OneTo{Int64}},Tuple{Dict{Int64,Int64},Dict{Int64,Int64}}},getfield(Main, Symbol("##35#36"))}) at .\array.jl:0
 [3] top-level scope at REPL[25]:1

Again, this would be a convenient tool for creating containers based on an existing DenseAxisArray.

There are a number of open issues relating to SparseAxisArrays: Issues · jump-dev/JuMP.jl · GitHub

Take a read. We’d appreciate the help.

Thanks for the quick response. I had seen those issues, but wanted to present these two ideas here first, as per the JuMP contribution guidelines, to see if I should discuss them further via JuMP issues (the first could probably be mentioned in one of the existing issues, while the second would probably warrant a new issue). Thanks.

  1. So I just tried:
Base.keys(x::SparseAxisArray) = keys(x.data)

Not sure why we haven’t done that already. Does it pass tests? If so, PR?

  2. Not sure what that is. Not even sure where to start.
  1. I can make a PR for this and first check that the tests pass (and perhaps raise an appropriate issue first).
  2. It appears that Base.collect(::Base.Generator) (julia/array.jl at v1.4.1 · JuliaLang/julia · GitHub) and/or Base.similar need(s) to be defined for DenseAxisArrays as per Interfaces · The Julia Language. I can also raise an issue for this if desired.
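For reference, the similar-based machinery that collect and map route through can be illustrated with a toy wrapper type (this is NOT JuMP's DenseAxisArray; it only shows which method would need a definition for the array interface to kick in):

```julia
# Toy wrapper array illustrating the Julia array interface that `map` and
# comprehensions rely on. `Wrapped` is a made-up type for this sketch.
struct Wrapped{T,N} <: AbstractArray{T,N}
    data::Array{T,N}
end
Base.size(w::Wrapped) = size(w.data)
Base.getindex(w::Wrapped, i::Int...) = w.data[i...]
Base.setindex!(w::Wrapped, v, i::Int...) = (w.data[i...] = v)

# With this `similar` method defined, `map(f, w)` returns another Wrapped:
Base.similar(w::Wrapped, ::Type{T}, dims::Dims) where {T} =
    Wrapped(Array{T}(undef, dims))

w = Wrapped([1 2; 3 4])
map(iseven, w)    # Wrapped([false true; false true])
[x for x in w]    # plain 2×2 Array (comprehensions call `similar(::Type{Array}, …)`)
```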
  1. Not sure why Base.keys(::IndexAnyCartesian, d::SparseAxisArray) redirects to keys(d). It might be a typo, and it should perhaps redirect to keys(d.data) instead. I have no objection to defining keys for DenseAxisArray and SparseAxisArray as long as it is consistent with keys for Array:
julia> A = ones(2, 2)
2×2 Array{Float64,2}:
 1.0  1.0
 1.0  1.0

julia> keys(A)
2×2 CartesianIndices{2,Tuple{Base.OneTo{Int64},Base.OneTo{Int64}}}:
 CartesianIndex(1, 1)  CartesianIndex(1, 2)
 CartesianIndex(2, 1)  CartesianIndex(2, 2)

  2. What should [i for i in b] be? An Array (so it’s like collect) or a DenseAxisArray (so it’s like copy)?
  1. This is a fair point, since it would be nice for the containers to act like arrays in this regard. Currently, DenseAxisArrays utilize DenseAxisArrayKeys (which is what keys returns) that more or less emulate CartesianIndices but still have some shortcomings (Unify treatment of `keys` on JuMP containers. · Issue #1988 · jump-dev/JuMP.jl · GitHub). Should SparseAxisArrays also have their own key type? In the meantime, would it not be useful to at least have keys(d::SparseAxisArray) = keys(d.data) until a more comprehensive change is made? This way there is a method to access the unique indices of each array type. eachindex doesn’t accomplish this, since it returns the CartesianIndices of the underlying array of a DenseAxisArray (which loses the index information). As a use case, this comes up in my extension where the user in certain cases needs to specify a container of variable references and an associated container of support values. There, keys is useful first to verify that both containers use the same set of indices and then to use those indices as needed. Since the user could specify an Array, DenseAxisArray, or SparseAxisArray, I need a single method that handles all of these AbstractArray types.

  2. I think it would be useful to create a DenseAxisArray (that way the indices are preserved), so that comprehension acts as it does for a normal array. Beyond simple copying, this is useful for performing more complex operations over a DenseAxisArray than broadcasting allows (and is probably more efficient as well). This would also be nice for SparseAxisArrays; currently they behave like dictionaries when used with comprehension-based array definitions (i.e., they produce a Vector of the operation results instead of another SparseAxisArray).
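The index-verification use case from point 1 can be sketched with plain dictionaries standing in for the containers of variable references and support values (the helper name is hypothetical):

```julia
# Sketch of the index-verification pattern: Dicts stand in for the variable
# and support containers, and `check_matching_indices` is a made-up helper.
function check_matching_indices(vars, supports)
    if Set(keys(vars)) != Set(keys(supports))
        error("variable and support containers must use the same indices")
    end
    return true
end

vars     = Dict((1,) => :x1, (2,) => :x2)
supports = Dict((1,) => 0.0, (2,) => 1.0)
check_matching_indices(vars, supports)   # true
```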

In the meantime, would it not be useful to at least have keys(d::SparseAxisArray) = keys(d.data) until a more comprehensive change is made?

The issue with this is that when a more comprehensive change is made, it would introduce a breaking change for the user. For your use case, you can do

_keys(x) = keys(x)
_keys(x::JuMP.Containers.SparseAxisArray) = keys(x.data)

as a temporary workaround.

  2. I see, it might indeed make sense to return a DenseAxisArray, and if the user uses [el for el in x if ...] then we return a SparseAxisArray. Note that you can also use map, which is closer to the comprehension and can be faster than broadcasting. Would that work for your use case?
julia> using JuMP

julia> JuMP.Containers.@container(x[i=1:3, j=i:3], i + j)
JuMP.Containers.SparseAxisArray{Int64,2,Tuple{Int64,Int64}} with 6 entries:
  [1, 2]  =  3
  [2, 3]  =  5
  [3, 3]  =  6
  [2, 2]  =  4
  [1, 1]  =  2
  [1, 3]  =  4

julia> x.^2
JuMP.Containers.SparseAxisArray{Int64,2,Tuple{Int64,Int64}} with 6 entries:
  [1, 2]  =  9
  [2, 3]  =  25
  [3, 3]  =  36
  [2, 2]  =  16
  [1, 1]  =  4
  [1, 3]  =  16

julia> map(el -> el^2, x)
JuMP.Containers.SparseAxisArray{Int64,2,Tuple{Int64,Int64}} with 6 entries:
  [1, 2]  =  9
  [2, 3]  =  25
  [3, 3]  =  36
  [2, 2]  =  16
  [1, 1]  =  4
  [1, 3]  =  16

That makes sense, and yes, those workarounds are close to the approaches I currently employ. The current limitation of using map as a workaround is that it doesn’t work for DenseAxisArrays:

julia> using JuMP 

julia>  JuMP.Containers.@container(x[i=1:3, j=2:3], 2)
2-dimensional DenseAxisArray{Int64,2,...} with index sets:
    Dimension 1, Base.OneTo(3)
    Dimension 2, 2:3
And data, a 3×2 Array{Int64,2}:
 2  2
 2  2
 2  2

julia> map(el -> iseven(el), x)
ERROR: MethodError: no method matching similar(::JuMP.Containers.DenseAxisArray{Int64,2,Tuple{Base.OneTo{Int64},UnitRange{Int64}},Tuple{Dict{Int64,Int64},Dict{Int64,Int64}}}, ::Type{Bool}, ::Tuple{Base.OneTo{Int64},UnitRange{Int64}})
Closest candidates are:
  similar(::AbstractArray, ::Type{T}) where T at abstractarray.jl:626
  similar(::AbstractArray, ::Type{T}, ::Union{Integer, AbstractUnitRange}...) where T at abstractarray.jl:629
  similar(::AbstractArray, ::Type{T}, ::Tuple{Vararg{Int64,N}}) where {T, N} at abstractarray.jl:637
  ...
Stacktrace:
 [1] _similar_for(::JuMP.Containers.DenseAxisArray{Int64,2,Tuple{Base.OneTo{Int64},UnitRange{Int64}},Tuple{Dict{Int64,Int64},Dict{Int64,Int64}}}, ::Type{Bool}, ::Base.Generator{JuMP.Containers.DenseAxisArray{Int64,2,Tuple{Base.OneTo{Int64},UnitRange{Int64}},Tuple{Dict{Int64,Int64},Dict{Int64,Int64}}},getfield(Main, Symbol("##24#25"))}, ::Base.HasShape{2}) at .\array.jl:519
 [2] _collect(::JuMP.Containers.DenseAxisArray{Int64,2,Tuple{Base.OneTo{Int64},UnitRange{Int64}},Tuple{Dict{Int64,Int64},Dict{Int64,Int64}}}, ::Base.Generator{JuMP.Containers.DenseAxisArray{Int64,2,Tuple{Base.OneTo{Int64},UnitRange{Int64}},Tuple{Dict{Int64,Int64},Dict{Int64,Int64}}},getfield(Main, Symbol("##24#25"))}, ::Base.EltypeUnknown, ::Base.HasShape{2}) at .\array.jl:0
 [3] collect_similar(::JuMP.Containers.DenseAxisArray{Int64,2,Tuple{Base.OneTo{Int64},UnitRange{Int64}},Tuple{Dict{Int64,Int64},Dict{Int64,Int64}}}, ::Base.Generator{JuMP.Containers.DenseAxisArray{Int64,2,Tuple{Base.OneTo{Int64},UnitRange{Int64}},Tuple{Dict{Int64,Int64},Dict{Int64,Int64}}},getfield(Main, Symbol("##24#25"))}) at .\array.jl:548
 [4] map(::Function, ::JuMP.Containers.DenseAxisArray{Int64,2,Tuple{Base.OneTo{Int64},UnitRange{Int64}},Tuple{Dict{Int64,Int64},Dict{Int64,Int64}}}) at .\abstractarray.jl:2073
 [5] top-level scope at REPL[29]:1

Thus, for operations that won’t work with broadcasting, there is no unifying method that can be called on all three container types.

Following up on my comment above, perhaps this definition could be added to JuMP:

function Base.map(f::Function, a::JuMP.Containers.DenseAxisArray)
    data = map(f, a.data)
    return JuMP.Containers.DenseAxisArray(data, axes(a)...)
end

Then map would at least work for all the container types:

julia> using JuMP

julia> JuMP.Containers.@container(x[i=1:3, j=2:3], 2)
2-dimensional DenseAxisArray{Int64,2,...} with index sets:
    Dimension 1, Base.OneTo(3)
    Dimension 2, 2:3
And data, a 3×2 Array{Int64,2}:
 2  2
 2  2
 2  2

julia> map(el -> iseven(el), x)
2-dimensional DenseAxisArray{Bool,2,...} with index sets:
    Dimension 1, Base.OneTo(3)
    Dimension 2, 2:3
And data, a 3×2 Array{Bool,2}:
 1  1
 1  1
 1  1

Indeed, it would be nice to make map work for DenseAxisArray. It seems it was an oversight; I thought it did work. It should be fixed by Fix map for DenseAxisArray by blegat · Pull Request #2235 · jump-dev/JuMP.jl · GitHub.