Is there a functional substitute for JuMP.@variable?

I think there is a basic criterion:

  • macros should yield to functions whenever possible

I know it is necessary to have macros build expressions, e.g.

JuMP.@expression(m, sum(c[i] * x[i] for i = 1:I))

since the last arg may be a highly complicated hand-written construct that needs to be parsed.

But I don’t think this applies to variables: I almost always need only a regular Array container for variables, so the desired return value has the same shape as, say, rand(), rand(3), rand(3, 4), or rand(2, 3, 4),
corresponding to
@variable(model),
@variable(model, [1:3]),
@variable(model, [1:3, 1:4]),
@variable(model, [1:2, 1:3, 1:4])

Yes, I also think that registering a name in the model is not a must. Therefore I think the anonymous syntax above is fairly sufficient. So we may find that the only “demanding” part of @variable is the container, which is not truly indispensable.
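To make the correspondence concrete, here is a sketch of the four anonymous forms under a fresh model (types shown in comments are what JuMP returns):

```julia
using JuMP

model = Model()
x0 = @variable(model)                    # a scalar VariableRef, like rand()
x1 = @variable(model, [1:3])             # Vector{VariableRef}, like rand(3)
x2 = @variable(model, [1:3, 1:4])        # Matrix{VariableRef}, like rand(3, 4)
x3 = @variable(model, [1:2, 1:3, 1:4])   # 3-D Array{VariableRef}, like rand(2, 3, 4)
```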

So the question is:

  • Can I use a functional API to achieve the same effect and performance as the macro API?

Say I have already defined an A::Matrix with size(A) == (2, 3), e.g. via A = rand(2, 3), and I want to add a decision variable of the same shape. Following the existing fashion I’ll have to write

z = @variable(model, [1:2, 1:3], integer = true)

Note that the [1:2, 1:3] part must be hand-written; it is not amenable to programmatic manipulation (otherwise, can anyone teach me how? :smiling_face_with_tear:).
Now I wonder if there is an alternative so I can write

z = my_add_variable(model; size = (2, 3), integer = true)

or even simpler

z = my_add_variable(model; similar_array = A, integer = true)

instead? I think there are no technical hurdles here, right? After all, the tasks that @variable lowers to are still functions.

I’m fairly confident there is no way to call this macro programmatically without hand-writing [1:2, 1:3]. I could have done something like map(_ -> @variable(model), A) where A = rand(2, 3) as a workaround, but this option is slower than the hand-written version.

How about this?

julia> A = rand(2, 3)
2×3 Matrix{Float64}:
 0.561848  0.182714   0.456506
 0.728612  0.0833966  0.689414

julia> @variable(model, [axes(A, 1), axes(A, 2)], Int)
2×3 Matrix{VariableRef}:
 _[8]  _[10]  _[12]
 _[9]  _[11]  _[13]

Thanks, but you are assuming A::Matrix, while I only assumed A::Array.
Therefore your method won’t apply to another instance where A = rand(2, 3, 4).

You may want MathOptInterface.jl instead of JuMP.jl.

In general in Julia you can use metaprogramming to generate expressions, etc. But in this case you probably just want MOI.
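For reference, a minimal sketch of what the MOI route looks like — a purely functional API with no macros (assuming a recent MathOptInterface where integrality is a VariableIndex-in-Integer constraint):

```julia
import MathOptInterface as MOI

# Sketch: add 2*3 variables functionally, mark each integer, then shape them.
model = MOI.Utilities.Model{Float64}()
x = MOI.add_variables(model, 2 * 3)               # Vector{MOI.VariableIndex}
for xi in x
    MOI.add_constraint(model, xi, MOI.Integer())  # integrality constraint
end
z = reshape(x, 2, 3)                              # any shape you like
```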

The second arg of @variable is so complex that I cannot fill it in properly even with a macro.
So I think a functional API would do better. Besides, @variable also has keyword options, e.g. container = Array.

About MOI: I used that package even before using JuMP. I find code written with JuMP shorter and easier to modify and test, so at present I almost exclusively use JuMP, just like I use Julia instead of MATLAB, which I somehow used to use in the past : )

You’re welcome to write your own container. Just use @variable(model) to create each scalar element:

julia> using JuMP

julia> function my_add_variable(model; size::NTuple, integer::Bool)
           return map(Iterators.product(Base.OneTo.(size)...)) do _
               return @variable(model, integer = integer)
           end
       end
my_add_variable (generic function with 1 method)

julia> model = Model()
A JuMP Model
├ solver: none
├ objective_sense: FEASIBILITY_SENSE
├ num_variables: 0
├ num_constraints: 0
└ Names registered in the model: none

julia> z = my_add_variable(model; size = (2, 3), integer = true)
2×3 Matrix{VariableRef}:
 _[1]  _[3]  _[5]
 _[2]  _[4]  _[6]
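A shape-agnostic variant of the same idea, matching the similar_array signature you proposed (a sketch; the helper name is hypothetical, not a JuMP API):

```julia
using JuMP

# Sketch: build an anonymous variable array with the same shape as any
# N-dimensional input array, since `map` preserves the array's shape.
function my_add_variable_like(model; similar_array::AbstractArray, integer::Bool = false)
    return map(_ -> @variable(model, integer = integer), similar_array)
end

model = Model()
z = my_add_variable_like(model; similar_array = rand(2, 3, 4), integer = true)
size(z)  # (2, 3, 4)
```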

julia> function test()
           I, J, K = s = 10, 30, 20
           A = rand(I, J, K)
           m = Model()
           @btime map(_ -> @variable($m, integer=true), $A)   
           @btime map(_ -> @variable($m, integer=true), $(CartesianIndices(A)))
           @btime map(_ -> @variable($m, integer=true), Iterators.product(Base.OneTo.($s)...))
           @btime map(_ -> @variable($m, integer=true), Iterators.product(axes($A)...))
           @btime map(_ -> @variable($m, integer=true), Iterators.product(map(Base.OneTo, $s)...))
           @btime @variable($m, [1:$I, 1:$J, 1:$K], Int)
       end
test (generic function with 1 method)

julia> test()
  812.139 μs (12003 allocations: 750.09 KiB)
  812.419 μs (12003 allocations: 750.09 KiB)
  812.050 μs (12003 allocations: 750.09 KiB)
  814.569 μs (12003 allocations: 750.09 KiB)
  816.289 μs (12003 allocations: 750.09 KiB)
  384.629 μs (12003 allocations: 750.09 KiB)

I used BenchmarkTools.jl, run on a server machine, to obtain these results.
I guess @variable has some optimized code path, since the built-in container version [1:I, 1:J, 1:K] is faster.