# Monomials with variable of a model

Does there exist a nice way to use a certain variable of a model, say

```julia
model = Model(COSMO.Optimizer)
@variable(model, u)
@variable(model, t)
```

to get a vector of monomials in `[u, t]` up to a certain degree (say, 10) with some nice command like `monomials([u, t], 0:10)`?

The error I get is: `no method matching monomials(::VariableRef, ::Vector{Int64})`. Closest candidates are: `monomials(::MultivariatePolynomials.AbstractVariable, ::Any...)`.

It would be nice to have such a short command rather than writing the vector of monomials out by hand (since I have more than 2 variables in the monomial vector).
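For context, `monomials` comes from MultivariatePolynomials (e.g. via DynamicPolynomials) and expects polynomial variables created with `@polyvar`, not JuMP `VariableRef`s — hence the "no method matching" error. A minimal sketch of the call that does work, assuming DynamicPolynomials is installed:

```julia
using DynamicPolynomials

# Polynomial variables, not JuMP decision variables.
@polyvar u t

# All monomials in u and t of total degree 0 through 2.
m = monomials([u, t], 0:2)
length(m)  # 6 monomials of degree <= 2
```

Passing a JuMP variable here fails precisely because `VariableRef` is not a `MultivariatePolynomials.AbstractVariable`.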

Moreover, suppose I already have a vector of polynomials in the model variable `u` (say, `p = Array{Polynomial}(undef, 2)`) and I want to form the product `p * p' * A`, where `A` is a 2×2 matrix variable of the model (`@variable(model, A[1:2, 1:2], PSD)`). Then I get the error: `no method matching *(::AffExpr, ::Polynomial{Float64})`.

All in all, the only thing I need is to add a constraint to the model involving the trace of the product of a matrix variable of the model with a matrix built from a monomial or polynomial vector in the model variable `u`.

> Does there exist a nice way …

Have you seen Variables · SumOfSquares?
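For reference, the idea on that page (a sketch, assuming SumOfSquares and DynamicPolynomials are installed) is to keep `u` and `t` as polynomial variables and make the model's decision variable a polynomial in them:

```julia
using SumOfSquares, DynamicPolynomials

model = SOSModel()
@polyvar u t

# A polynomial decision variable of degree <= 10 in u and t,
# with one JuMP coefficient per monomial.
@variable(model, p, Poly(monomials([u, t], 0:10)))
```

Here the JuMP variables are the coefficients of `p`, while `u` and `t` stay symbolic.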

If that isn’t what you want, and you want a vector of monomials of JuMP variables, like `[u, u^2, u^3, ...]`, use:

```julia
@NLexpression(model, [i = 0:10], u^i)
```

But this will create a vector of nonlinear expressions, so you cannot use a conic solver like COSMO.
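For completeness, a sketch of how those nonlinear expressions could then be used — they can only appear inside `@NL*` macros (the constraint here is just an illustrative placeholder):

```julia
using JuMP

model = Model()
@variable(model, u)

# mons[i] represents u^i as a nonlinear expression, for i = 0, ..., 10.
mons = @NLexpression(model, [i = 0:10], u^i)

# Nonlinear expressions must be used via @NLconstraint / @NLobjective.
@NLconstraint(model, sum(mons[i] for i in 0:10) == 1)
```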

> All in all, the only thing I need …

So to clarify, you want something like this?

```julia
using JuMP, LinearAlgebra
model = Model()
@variable(model, u)
@variable(model, A[1:2, 1:2], PSD)
p = [u, u^2]
@constraint(model, tr(p * p' * A) == 1)
```
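As a sanity check with plain numbers (not model variables): the trace of `p * p' * A` reduces to the quadratic form `p' * A * p`, which is the scalar quantity the constraint pins to 1.

```julia
using LinearAlgebra

p = [1.0, 2.0]
A = [2.0 1.0; 1.0 3.0]

# tr(p * p' * A) == p' * A * p for any square A.
lhs = tr(p * p' * A)
rhs = p' * A * p
@assert lhs ≈ rhs  # both equal 18.0 here
```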

Yes, I need something like this (but with many more variables):

```julia
using JuMP, LinearAlgebra
model = Model()
@variable(model, u)
@variable(model, A[1:3, 1:3], PSD)
p = [1, u, u^2]
@constraint(model, tr(p * p' * A) == 1)
```

JuMP isn’t set up to solve problems like this. You could write:

```julia
using JuMP
model = Model()
@variable(model, u)
@variable(model, A[1:3, 1:3], PSD)
@NLexpression(model, p[i=1:3], u^(i-1))
@NLconstraint(
    model,
    sum(sum(p[i] * p[j] * A[j, i] for j in 1:3) for i in 1:3) == 1,
)
```

but now you’ll need a solver that supports PSD constraints and nonlinear equality constraints.

You could get rid of the explicit PSD constraint using Cholesky:

```julia
using JuMP, LinearAlgebra
model = Model()
@variable(model, u)
@variable(model, L[1:3, 1:3], Symmetric)
A = Matrix(LowerTriangular(L)) * Matrix(LowerTriangular(L))'
@NLexpression(model, p[i=1:3], u^(i-1))
@NLconstraint(
    model,
    sum(p[i] * sum(p[j] * A[j, i] for j in 1:3) for i in 1:3) == 1,
)
```

but this will be really hard to solve.
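The reason the explicit PSD constraint can be dropped: for any lower-triangular `L`, the product `L * L'` is positive semidefinite by construction (it is a Cholesky-style factorization run in reverse). A quick numeric check of that identity:

```julia
using LinearAlgebra

L = LowerTriangular([1.0 0.0 0.0; 2.0 3.0 0.0; 4.0 5.0 6.0])
A = Matrix(L) * Matrix(L)'

# A = L * L' is symmetric and, since L has nonzero diagonal here,
# positive definite.
@assert issymmetric(A)
@assert isposdef(A)
```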

Where does the problem come from? How do other people solve it?