I am doing some optimisation; the problems all look the same, but the number of variables differs. I am trying to create a function which will do the optimisation problem in one line (optimise_this(some_problem)). The problem I am having is that I do not know how to register an arbitrary number of functions as constraints.
What I have is basically this:
using JuMP, Ipopt

function optimise_problem(constraints_functions::Vector{Function}, gradients::Vector{Function})
    n = length(constraints_functions)
    m = Model(solver = IpoptSolver())
    @variable(m, x[1:n] >= 0.0)
    for i = 1:n
        # this is the part that does not work: each function needs its own unique name
        JuMP.register(m, :fun_i, n, constraints_functions[i], gradients[i])
        @NLconstraint(m, fun_i(x...) == 0)
    end
    @objective(m, Max, 1)   # dummy objective, I only care about feasibility
    solve(m)
    return getvalue(x)
end
(well, this is more solving than optimisation, but anyway)
The first problem is that I am a bit unsure how to correctly pass the entire x vector into my function in @NLconstraint(m, fun_i(x...)==0). Here I try x..., which does not seem to work.
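To make that concrete, here is a small standalone toy version of what I mean (f and ∇f are just made-up placeholders); writing the arguments out explicitly works, but the splatted form is what I actually need:

f(x1, x2) = x1^2 + x2 - 1.0
function ∇f(g, x1, x2)
    g[1] = 2 * x1
    g[2] = 1.0
end
m2 = Model(solver = IpoptSolver())
@variable(m2, x[1:2] >= 0.0)
JuMP.register(m2, :f, 2, f, ∇f)
@NLconstraint(m2, f(x[1], x[2]) == 0)   # spelling out the arguments works
@NLconstraint(m2, f(x...) == 0)         # this splatted form does not seem to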
The major problem is how to actually declare the function so that I can give it an arbitrary (unique) name and then add it as a constraint. Setting :fun_i as in my example obviously does not work (but it is the easiest way to illustrate what I want to do). If I unrolled the for loop and did it manually I could just write
JuMP.register(m, :fun_1, n, constraints_functions[1], gradients[1])
@NLconstraint(m, fun_1(x...) == 0)
JuMP.register(m, :fun_2, n, constraints_functions[2], gradients[2])
@NLconstraint(m, fun_2(x...) == 0)
...
The registration I could fix properly using JuMP.register(m, Symbol(:fun_, 1), n, constraints_functions[1], gradients[1]). However, the declaration in the constraint does not allow for this. I have been thinking of metaprogramming somehow, but using eval() has a lot of implications and always feels dangerous, especially when it might not be needed. I was thinking of using a macro, but that shouldn't work either, right? (Since the macro cannot see the content of the dim variable, it wouldn't be able to build fun_i(x...) properly in the constraint either, but maybe I am wrong.)
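So, concretely, the registration half of the loop I can already do with generated names; it is only the constraint line I cannot express (as far as I can tell the macro wants the function name written out literally):

for i = 1:n
    # registering under a generated name is fine:
    JuMP.register(m, Symbol(:fun_, i), n, constraints_functions[i], gradients[i])
    # ... but I see no way to splice that generated name into the macro call:
    # @NLconstraint(m, fun_i(x...) == 0)   # "fun_i" would need to become fun_1, fun_2, ...
end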
The best I can come up with is just to write a humongous function like
function set_constraints(m, x, i, n, constraints_functions, gradients)
    JuMP.register(m, :fun_1, n, constraints_functions[1], gradients[1])
    @NLconstraint(m, fun_1(x...) == 0)
    (i == 1) && return
    JuMP.register(m, :fun_2, n, constraints_functions[2], gradients[2])
    @NLconstraint(m, fun_2(x...) == 0)
    (i == 2) && return
    ...
    JuMP.register(m, :fun_1000, n, constraints_functions[1000], gradients[1000])
    @NLconstraint(m, fun_1000(x...) == 0)
    (i == 1000) && return
end
and then I just make sure to have more entries than I might need… However, it feels beyond ludicrous if this is the only way to do it…
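(I could even auto-generate that file with a small throwaway script, roughly like the sketch below, which only underlines how silly the approach feels:)

open("set_constraints.jl", "w") do io
    println(io, "function set_constraints(m, x, i, n, constraints_functions, gradients)")
    for k = 1:1000
        println(io, "    JuMP.register(m, :fun_$k, n, constraints_functions[$k], gradients[$k])")
        println(io, "    @NLconstraint(m, fun_$k(x...) == 0)")
        println(io, "    (i == $k) && return")
    end
    println(io, "end")
end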
It doesn't feel like what I want to do is fundamentally problematic, but I am not sure how to work within the interface provided by JuMP to actually do it. However, I am quite new to JuMP, so I figured I'd ask here and see if someone more familiar with it can help.