@NLconstraint with a sum, array and scalar variables

Hello and happy new year!

Below is a toy problem.
It deals with a mix of 4 chemicals.
The quantities of the chemicals are given in the array xs.
The temperature of the mix is the only unknown, subject to the constraint (h(T) == h0).

I would like to generalize this problem.
The composition xs should also become a vector of unknowns.
There would be additional constraints and/or an objective to determine this composition.
Furthermore, the list of chemicals, including its size, could change depending on the application.

I tried several ideas to implement this first generalization.
In particular, I tried using add_NL_constraint with some rather cryptic expressions.
I did not succeed (except when writing out the 4 terms explicitly, which does not help with generalization).

A further generalization would involve several mixes with their own temperatures and compositions.
The whole thing might finally represent a chemical process and some constrained optimization of this process.

I would appreciate your suggestions to help me perform this first generalization step, or more …

Thanks,

Michel

using JuMP
using Ipopt

hC(T)   = 16259.0   - 569915.0/T - 1772.5*sqrt(T)  + 55.6627*T - 0.00391541*T^2 + 3.96106e-7*T^3
hO2(T)  = 4192.37   - 612446.0/T - 1042.28*sqrt(T) + 53.7523*T - 0.00194002*T^2 + 2.71124e-7*T^3
hN2(T)  = -476.976  - 507331.0/T - 541.1*sqrt(T)   + 38.1673*T + 0.00164584*T^2 - 2.34068e-7*T^3
hCO2(T) = -380911.0 - 803232.0/T - 2140.86*sqrt(T) + 91.4313*T - 0.00225023*T^2 + 1.38949e-7*T^3

function test(fs)
    xs = [1.0, 1.0, 3.76, 0.0]

    h(T) = sum(x*f(T) for (x,f) in zip(xs,fs))
    h0 = h(321)

    model = Model(Ipopt.Optimizer)
    set_silent(model)
    JuMP.register(model, :h, 1, h; autodiff=true)

    @variable(model, T, start = 300)
    @NLconstraint(model, h(T) ==  h0)      # <<<<  the constraint to be generalized

    JuMP.optimize!(model)
    Tad = JuMP.value(T)
    Tad, h(Tad), h0
end

fs = [hC,hO2,hN2,hCO2]
test(fs)

You need to use the JuMP.set_start_value() function to provide an initial value for the variable T. The following code works.

using JuMP
using Ipopt

hC(T)   = 16259.0   - 569915.0/T - 1772.5*sqrt(T)  + 55.6627*T - 0.00391541*T^2 + 3.96106e-7*T^3
hO2(T)  = 4192.37   - 612446.0/T - 1042.28*sqrt(T) + 53.7523*T - 0.00194002*T^2 + 2.71124e-7*T^3
hN2(T)  = -476.976  - 507331.0/T - 541.1*sqrt(T)   + 38.1673*T + 0.00164584*T^2 - 2.34068e-7*T^3
hCO2(T) = -380911.0 - 803232.0/T - 2140.86*sqrt(T) + 91.4313*T - 0.00225023*T^2 + 1.38949e-7*T^3

function test(fs)
    xs = [1.0, 1.0, 3.76, 0.0]
    h(T) = sum(x*f(T) for (x,f) in zip(xs,fs))
    h0 = h(321)
    model = Model(Ipopt.Optimizer)
    set_silent(model)
    JuMP.register(model, :h, 1, h; autodiff=true)
    @variable(model, T)
    JuMP.set_start_value(T, 300)
    @NLconstraint(model, h(T) ==  h0)      # <<<<  the constraint to be generalized
    JuMP.optimize!(model)
    Tad = JuMP.value(T)
    return (Tad, h(Tad), h0)
end

fs = [hC,hO2,hN2,hCO2]
Tad, hh, h0 = test(fs)

Thanks a lot for your care, Mahbubar06.

The original code had the initialization in the @variable statement.
On my computer, it worked perfectly.

My concern was about how to make this code more general and more flexible.
In the toy problem above, I assumed the xs vector is a “constant”, not a set of variables.
In general, I need the xs to be variables as well as T.
In addition, the list of chemical substances is not known in advance, unlike in my example.
The difficulty is that a function registered in a JuMP model only accepts scalar arguments, so it cannot take the vector xs directly.
Therefore, this does not work with JuMP as written:

        ...
        @variable(model, x[1:4])
        ...
        h(xs,T) = sum(x*f(T) for (x,f) in zip(xs,fs))
        ...
        @NLconstraint(model, h(xs,T) ==  h0)
        ...

However, JuMP allows any number of variables in an expression used in a constraint or in an objective.
Therefore, the code below works without problem even though the “xs” are now JuMP variables:

using JuMP
using Ipopt

hC(T)   = 16259.0   - 569915.0/T - 1772.5*sqrt(T)  + 55.6627*T - 0.00391541*T^2 + 3.96106e-7*T^3
hO2(T)  = 4192.37   - 612446.0/T - 1042.28*sqrt(T) + 53.7523*T - 0.00194002*T^2 + 2.71124e-7*T^3
hN2(T)  = -476.976  - 507331.0/T - 541.1*sqrt(T)   + 38.1673*T + 0.00164584*T^2 - 2.34068e-7*T^3
hCO2(T) = -380911.0 - 803232.0/T - 2140.86*sqrt(T) + 91.4313*T - 0.00225023*T^2 + 1.38949e-7*T^3
fs = [hC,hO2,hN2,hCO2]

function test(hs)
    model = Model(Ipopt.Optimizer)
    @variable(model, xs[1:4])
    @variable(model, T, start = 300)
    @NLconstraint(model, xs[1] + xs[2] == 2.00)   # <<< just an example
    @NLconstraint(model, xs[2] + xs[3] == 4.76)   # <<< just an example
    @NLconstraint(model, xs[3] + xs[4] == 3.76)   # <<< just an example
    @NLconstraint(model, xs[4]         == 0.00)   # <<< just an example

    for h in hs
        JuMP.register(model, Symbol(h), 1, h; autodiff=true)   # register each enthalpy function under its own name
    end

    h0 = 3365.249312049329
    @NLconstraint(model, xs[1]*hC(T)+xs[2]*hO2(T)+xs[3]*hN2(T)+xs[4]*hCO2(T) ==  h0)

    JuMP.optimize!(model)
    JuMP.value(T), h0, JuMP.value.(xs)
end

test(fs)

But it is not enough.
For my purpose, the list of substances (hs) will only be known at run time.
For example, the same test(hs) function should also work for other cases, such as:

test([hC,hO2,hN2,hCO2])
test([hC,hO2])
test([hC,hO2,hN2,hCO2,hCaCO3,hSiO2,hFe2O3,hCaO,hAl2O3,hNa2O])

Of course, the xs vector should be sized accordingly, and the constraints or objective functions should be more general than in the example. The constraints could be the mass balances of the atoms involved in the substances, the objective could be related to the cost of the substances, and there could be still other constraints. To keep things simple, I won’t go into these details here (though a size-generic version of the placeholder linear constraints above might look like the short sketch below).
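
Just to sketch that direction: the placeholder linear constraints could be written against a balance matrix and a right-hand-side vector that would be assembled at run time from the substance list. The numbers below simply reuse the placeholder values from the example above; the real A and b would come from the atom balances.

using JuMP

N = 4
A = [1.0 1.0 0.0 0.0;    # made-up balance coefficients, one row per balance
     0.0 1.0 1.0 0.0;
     0.0 0.0 1.0 1.0;
     0.0 0.0 0.0 1.0]
b = [2.00, 4.76, 3.76, 0.00]

model = Model()
@variable(model, xs[1:N])
@constraint(model, A * xs .== b)   # vectorized constraints; the size follows N automatically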
However, the nonlinear constraint should be modified from

    @NLconstraint(model, xs[1]*hC(T)+xs[2]*hO2(T)+xs[3]*hN2(T)+xs[4]*hCO2(T) ==  h0)

to something like:

    @NLconstraint(model, sum(x*h(T) for (x,h) in zip(xs,hs)) ==  h0)

If you try this last version, JuMP does not understand it and returns the following error:

Unrecognized function "h" used in nonlinear expression.

This suggests that JuMP takes the symbol “h” literally, which fails because no function named “h” has been registered in the model. You can check, for example, that this replacement runs properly, although it has no physical meaning:

    @NLconstraint(model, sum(x*hC(T) for (x,h) in zip(xs,hs)) ==  h0)

The fact that this statement is understood also suggests that the sum itself might work.
Unfortunately, I could not find any way to make the following work, nor any alternative:

    @NLconstraint(model, sum(x*h(T) for (x,h) in zip(xs,hs)) ==  h0)

For example, this alternative statement does not work either:

    @NLconstraint(model, sum(xs[i]*hs[i](T) for i in 1:4) ==  h0)

It returns this error message, which I can’t make sense of:

LoadError: MethodError: no method matching _is_sum(::Expr)

Note that the same error is returned when trying this statement:

    @NLconstraint(model, xs[1]*hs[1](T)+xs[2]*hs[2](T)+xs[3]*hs[3](T)+xs[4]*hs[4](T) ==  h0)

Even this physically meaningless attempt returns the same error:

    @NLconstraint(model, xs[1]*hs[1](T) ==  h0)

I tried many other ways to make JuMP understand my problem in a less “literal” way so that I could make a more general and flexible program.
Mainly, I tried everything I could think of with metaprogramming, especially:

  • using add_NL_constraint with expressions built via appropriate interpolation (roughly along the lines of the sketch after this list);
  • and even using Meta.parse with strings representing what I would need, in the hope that assembling strings would be a way to make progress. Unfortunately, I could not inject the parsed expression into a constraint.
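
For reference, the kind of construction I was aiming for looks roughly like this. It is only a sketch: it assumes that add_NL_constraint (if I understand correctly, later JuMP versions call it add_nonlinear_constraint) accepts such a raw expression with interpolated variables and with the functions registered under their own names, which is exactly what I could not get to work in practice.

using JuMP
using Ipopt

function test_raw(hs)
    N = length(hs)
    model = Model(Ipopt.Optimizer)
    @variable(model, xs[1:N] >= 0)
    @variable(model, T, start = 300)
    for h in hs
        register(model, Symbol(h), 1, h; autodiff = true)   # one registered name per substance
    end
    h0 = 3365.249312049329
    # Build xs[1]*h1(T) + xs[2]*h2(T) + ... as a raw Julia expression,
    # splicing in the JuMP variables and the registered function names.
    terms = [:($(xs[i]) * $(Symbol(hs[i]))($(T))) for i in 1:N]
    add_NL_constraint(model, :($(Expr(:call, :+, terms...)) == $h0))
    optimize!(model)
    return value(T), value.(xs)
end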

As you can see, I do need some suggestions …
Without a good suggestion, I will probably give up on Julia, as I already did 4 years ago.
I would deeply regret it.
With Pyomo in Python, I could do everything I have described here very easily, and much more. I can guess why that is, in particular given the operator overloading implemented for Pyomo variables and expressions.
I really hope I can find a way to do the same with Julia and JuMP, because of its other advantages.

Thanks

You can use the splatting operator ... to pass a variable number of arguments.

using JuMP
using Ipopt

function test(hs)
    N = length(hs)
    # foo takes T plus N scalar amounts: a registered function cannot accept a
    # vector argument, so the xs are passed as separate scalar arguments.
    function foo(T, xs...)
        return sum(x * h(T) for (x, h) in zip(xs, hs))
    end
    model = Model(Ipopt.Optimizer)
    @variable(model, xs[1:N] >= 0)
    @constraint(model, sum(xs) == 1.0)
    @variable(model, T >= 0, start = 3)
    register(model, :foo, 1 + N, foo; autodiff = true)   # 1 + N scalar arguments
    h0 = 3365.249312049329
    @NLconstraint(model, foo(T, xs...) == h0)   # splatting expands xs into N scalar arguments
    optimize!(model)
    return value(T), h0, value.(xs)
end

hC(T)   = 16259.0   - 569915.0/T - 1772.5*sqrt(T)  + 55.6627*T - 0.00391541*T^2 + 3.96106e-7*T^3
hO2(T)  = 4192.37   - 612446.0/T - 1042.28*sqrt(T) + 53.7523*T - 0.00194002*T^2 + 2.71124e-7*T^3
hN2(T)  = -476.976  - 507331.0/T - 541.1*sqrt(T)   + 38.1673*T + 0.00164584*T^2 - 2.34068e-7*T^3
hCO2(T) = -380911.0 - 803232.0/T - 2140.86*sqrt(T) + 91.4313*T - 0.00225023*T^2 + 1.38949e-7*T^3

fs = [hC, hO2, hN2, hCO2]
test(fs)

:rofl:
I can’t believe it!
I spent so much time on this … even metaprogramming!
And there was such a simple solution!
Thanks a lot!
:+1:

Should I understand that here splatting turns all the arguments into one vector?
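
From a quick check in the plain REPL (outside JuMP), it seems to be rather the other way around: the xs... at the call site splats the vector into separate scalar arguments, while the xs... in the function signature slurps them back into a tuple:

f(T, xs...) = (T, xs)     # xs... in the signature collects the trailing arguments into a tuple

v = [1.0, 2.0, 3.0]
f(300, v...)              # v... at the call site expands the vector into 3 separate arguments
# returns (300, (1.0, 2.0, 3.0))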
:ferris_wheel: