Polynomial optimization without sum of squares

TL;DR: I would like to solve a linear program whose data involves multivariate polynomials. So I need tools to transform polynomials to their vectors of coefficients and back. Moreover, after solving the corresponding linear program, I would like to reinterpret the solution as a certificate of positivity of the polynomial I am interested in, over the region of interest (defined by linear inequalities).

While I could probably write this from scratch, most of the functionality I need is probably already available in SumOfSquares or Polynomials or some other Julia package. Any comments would be appreciated.

Suppose you have 3 variables, x, y, z, say, and linear inequalities of the form

0 <= x <= 1
0 <= y <= 1
0 <= z <= 1
1 <= x + y + z <= 2

This gives us a number of nonnegative polynomials,

x
y
z
1-x
1-y
1-z
x+y+z-1
2-x-y-z

I would like to form the vector of all polynomials of degree up to 2 in the previous polynomials. This should for example contain 1, x, y, z, 1-x, etc., but also x^2, xy, x(1-x) and so on. Let us call this vector X, say.
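For concreteness, here is a minimal sketch of forming this vector of products (assuming DynamicPolynomials; the names `generators` and `X` are mine, not from any package):

```julia
using DynamicPolynomials

@polyvar x y z
generators = [x, y, z, 1 - x, 1 - y, 1 - z, x + y + z - 1, 2 - x - y - z]

# degree 0 (the constant 1), degree 1 (the generators themselves),
# and degree 2 (all pairwise products, including squares such as x^2
# and mixed products such as x * (1 - x))
X = [one(generators[1]); generators;
     [generators[i] * generators[j]
      for i in 1:length(generators) for j in i:length(generators)]]

length(X)  # 1 + 8 + 36 = 45
```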

I am given a polynomial p

p = x*(1-y) + y*(1-z) + z*(1-x)

I am interested in the problem of writing, if possible, p as a linear combination of the polynomials in X with nonnegative coefficients and, moreover, finding such an expression with the highest possible coefficient of 1.
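In other words, writing $X = (X_1, \dots, X_m)$ with $X_1 = 1$, the problem is the linear program

$$
\max_{c \in \mathbb{R}^m_{\ge 0}} \; c_1 \quad \text{subject to} \quad \sum_{i=1}^{m} c_i X_i = p,
$$

where the polynomial equality amounts to one linear equation per monomial, so this is an ordinary LP in the coefficient vector $c$.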

Note that I specifically avoided mentioning sums of squares, because in the problem I am interested in, each variable actually represents a matrix, the product is the symmetric Kronecker product, and the inequality represents the Loewner order. As a result, squares of polynomials in x, y and z are not necessarily nonnegative.

That being said, I wonder whether the above problem can be solved in Julia, for example using JuMP and SumOfSquares or some other packages. I looked through the documentation for a while but could not yet find what I need. If you know how to solve the problem above, could you please help, or maybe give some clues? Thank you.

Hi @joemaik, welcome to the forum :smile:

I don’t know the answer, so this is a question for @blegat.

Hi @odow, I wrote the code from scratch, by basically defining the cone of nonnegative polynomials and converting my problem to a linear program, which I then solved using JuMP and an LP solver (HiGHS).


Would you mind showing us the code?

```julia
using DynamicPolynomials
using JuMP
using HiGHS

@polyvar x[1:3]

# The target polynomial
P = x[1] * (1 - x[2]) + x[2] * (1 - x[3]) + x[3] * (1 - x[1])

# Nonnegative generators of the region
S = [x[1], 1 - x[1], x[2], 1 - x[2], x[3], 1 - x[3],
     x[1] + x[2] + x[3] - 1, 2 - x[1] - x[2] - x[3]]

# Monomial basis of degree up to 2
poly_basis = monomials([x[1], x[2], x[3]], 0:2)

# Coefficient vector of a polynomial with respect to the basis
function coeffs(poly, poly_basis)
    return [DynamicPolynomials.coefficient(poly, monom) for monom in poly_basis]
end

# Inverse operation: rebuild the polynomial from its coefficients
function build_poly(coeffs, poly_basis)
    return sum(coeffs .* poly_basis)
end

# Sanity check: the round trip reproduces P
build_poly(coeffs(P, poly_basis), poly_basis) == P

# One auxiliary variable per generator; their monomials of degree
# up to 2 enumerate all products of at most two generators
@polyvar p[1:length(S)]
constraints_short = monomials(p, 0:2)

# Substitute the generators back in to obtain the actual products
my_subs = tuple(p...) => tuple(S...)
constraints = [subs(c, my_subs) for c in constraints_short]

# Columns of A are the coefficient vectors of the products
A = zeros(length(poly_basis), length(constraints))
for j in 1:length(constraints)
    A[:, j] = coeffs(constraints[j], poly_basis)
end

rhs = coeffs(P, poly_basis)

# LP: maximize the coefficient of the constant product 1
# (note: `>=`, not `.>=`, in @variable)
model = Model(HiGHS.Optimizer)
@variable(model, c[1:length(constraints)] >= 0)
@objective(model, Max, c[1])
@constraint(model, A * c == rhs)

optimize!(model)

c_opt = value.(c)

# The certificate, expressed in the auxiliary variables p
certificate = sum(c_opt .* constraints_short)
```

The substitution step is slow, though (not in the example above, but in some higher-degree examples). I wonder if I can optimize that part of the code…

What you are doing (multiplying x with 1 - y to get a new nonnegative polynomial) is called the Schmüdgen certificate; see Certificate · SumOfSquares.
The quadratic case, where you take pairwise products of linear functions, is a special case of Schmüdgen.
In general you can multiply by SOS polynomials, but in this case, because your products are already of degree 2, the SOS multiplier must have degree 0, which means a nonnegative constant; this is your c.
I'm not quite sure why the substitution is slow, but you could write more specialized code to speed it up, for example by iterating over all pairs directly.
Using the Schmüdgen example I mentioned above with maxdegree = 2 might also be faster; you can try.
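For example, that pairwise iteration could look like this (a sketch; it assumes the same `S` as in the code above and replaces the `subs`-based construction of `constraints`):

```julia
using DynamicPolynomials

@polyvar x[1:3]
S = [x[1], 1 - x[1], x[2], 1 - x[2], x[3], 1 - x[3],
     x[1] + x[2] + x[3] - 1, 2 - x[1] - x[2] - x[3]]

# the constant, the generators, and all pairwise products: the same
# products as monomials(p, 0:2) after substitution, but built directly,
# with no auxiliary variables and no call to subs
constraints = [one(S[1]); S;
               [S[i] * S[j] for i in 1:length(S) for j in i:length(S)]]
```

Note that the ordering may differ from that of `monomials(p, 0:2)`, so the certificate reconstruction at the end would need to use the same index order.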


Thank you @blegat! Yes, I have actually read the Schmüdgen certificate page and the code you mentioned. You are right that in this specific example there is no need to “turn off” the SOS polynomials, but I am also interested in higher-degree examples, for which it is important not to use SOS polynomials. However, I don’t understand the code related to the Schmüdgen certificate well enough to modify it so that it only forms products of the nonnegative polynomials (on the region) I have provided, with no SOS polynomials (in my ring, squares of arbitrary polynomials are not necessarily nonnegative). Is it possible to modify the Schmüdgen certificate code so that it does not form SOS polynomials, but only products of the nonnegative polynomials I have provided?