(Julia beginner)
Still thinking about nonlinear programming with JuMP, with some generality.
(@NLconstraint with a sum, array and scalar variables)
There are many JuMP NLP examples available, but they are all more or less similar to the Rosenbrock problem, in the sense that they involve a few explicit functions for the objective and the constraints.
My intuition for my own problem (pyrochemistry) is that I might need to create many nonlinear functions, and that I would use them as elements of an array of functions or a dictionary of functions.
Therefore, when registering some new function(s) under some symbol(s) (say, registering a 1-argument function h under the symbol :hJuMP with register(model, :hJuMP, 1, h; autodiff = true)), there is the problem that I will probably never be able to literally write the hJuMP symbol created by that statement. Instead, my likely use of hJuMP would be as one element of a list of such functions:
using JuMP, Ipopt

function main(functions)
    model = Model(Ipopt.Optimizer)
    @variable(model, x)
    for (i, f) in enumerate(functions)
        # register each function under a generated symbol :f_1, :f_2, ...
        f_sym = Symbol("f_$(i)")
        register(model, f_sym, 1, f; autodiff = true)
        # splice the generated symbol and the variable into a raw constraint expression
        add_NL_constraint(model, :($(f_sym)($x) <= 1.0))
    end
    return model
end

main([
    x -> x^2,
    x -> sin(x)^2,
])
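For what it is worth, this seems to do what I want for independent constraints; printing the returned model should show one nonlinear constraint per registered function (a quick check, nothing authoritative):

model = main([x -> x^2, x -> sin(x)^2])
print(model)  # should show the two nonlinear constraints built in the loop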
How would you modify the last line of the following snippet so that it involves a sum() over these two functions?
using JuMP, Ipopt

model = Model(Ipopt.Optimizer)
f_sym = [:f_1, :f_2]
f_fun = [x -> x^2, x -> sin(x)^2]
@variable(model, x)
for (s, f) in zip(f_sym, f_fun)
    register(model, s, 1, f; autodiff = true)
end
# @NLconstraint(model, f_1(x) + f_2(x) <= 1.0)
add_NL_constraint(model, :($(f_sym[1])($x) + $(f_sym[2])($x) <= 1.0))
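As far as I can tell, the $ interpolations are resolved when the quoted expression is constructed, before add_NL_constraint ever sees it, so the call above should be equivalent to something like this (a sketch of my understanding, not authoritative):

ex = :($(f_sym[1])($x) + $(f_sym[2])($x) <= 1.0)
# the symbols :f_1, :f_2 and the actual VariableRef x are now spliced into ex
println(ex)   # f_1(x) + f_2(x) <= 1.0
add_NL_constraint(model, ex)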
As a beginner, I already get confused when trying such adjustments. Here is a (very) naïve attempt:
add_NL_constraint(model, :( sum( $(f_sym[i])($x) for i in 1:2) <= 1.0) )
or this variant:
add_NL_constraint(model, :( sum( (($f)($x)) for f in f_sym) <= 1.0) )
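(My guess is that both attempts fail for the same reason: i and f are not defined at the moment the $ interpolations are resolved, since the generator inside :( ) is kept as an unevaluated expression rather than being run at construction time. My best guess at a fix, following the splatting idiom from the Julia metaprogramming docs, would be to build the + expression programmatically and splice it back in; an untested sketch:)

# build :(f_1(x) + f_2(x)) by splatting the per-function call expressions into +
ex = Expr(:call, :+, (:($(s)($x)) for s in f_sym)...)
add_NL_constraint(model, :($ex <= 1.0))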
Could you also suggest some readings on metaprogramming that deal with this kind of yoga with expressions, explaining the mechanics in a bit more detail than what I have found so far?