Finding the root of a minimize-maximize function

Let

f1(x,y) = -(1 - x)*y + 1/2*(1 - x - x^2 - x^3 - 2*(1 - x)*y)
f2(x,y) = (1 - 2*x)*(1 + x + x^2) - 2*(-1 + x)^2*(1 + x)*y

We can define a minimize, maximize function like this

using IntervalArithmetic, IntervalOptimisation
g(x) = minimise(y -> max(f1(x, y), f2(x, y)), 0..1)[1]

What I am looking for is to find the root of g(x) on [0,1].

In this particular case, I can actually solve it symbolically. But I was wondering whether, for similar problems with more variables and more polynomials, a numerical method offers a better solution.


Another way to phrase the problem is this: given f1(x, y) and f2(x, y), find the minimal x such that there exists a y for which f1(x, y) < 0 and f2(x, y) < 0.

BTW, symbolically this can be solved with Cylindrical Algebraic Decomposition, although it can get very slow when the problem gets bigger.
http://www.algebra.uni-linz.ac.at/people/mkauers/publications/kauers11a.pdf


Using the Positivstellensatz, you know that the set {(x, y) | f1(x, y) <= 0, f2(x, y) <= 0, x = a} is empty if and only if there exist f in the preorder generated by -f1 and -f2, and h in the ideal generated by x - a, such that f + h = -1.
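As a toy illustration of such a certificate (my own example, not tied to f1 and f2): the set {x | -x >= 0, x - 1 >= 0} is empty, and combining the two generators with the trivially-SOS constant multipliers s_1 = s_2 = 1 already gives -1:

```julia
# Toy Positivstellensatz certificate (illustrative example):
# the set {x : -x >= 0, x - 1 >= 0} is empty, and the preorder element
# s1*(-x) + s2*(x - 1) with constant SOS multipliers s1 = s2 = 1
# collapses to the constant -1, certifying emptiness.
cert(x) = 1 * (-x) + 1 * (x - 1)

# identically -1 at every point
all(cert(x) == -1 for x in -2.0:0.5:2.0)
```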
You can formulate it as a Sum-of-Squares program with
s_1 * f1 + s_2 * f2 + s_3 * f1 * f2 + p * (x - a) = -1 where s_i are SOS polynomials and p is any polynomial.
To solve it numerically, you will need to fix the maximal degree of the expression but you can build a hierarchy of SOS programs with higher and higher maximal degree.
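The hierarchy can be sketched as a simple loop (a sketch with my own naming; `oracle(d)` stands for an SOS feasibility check at maximal degree `d`):

```julia
# Sketch of climbing the SOS hierarchy: try even maximal degrees until a
# certificate of emptiness is found or a degree cap is reached.
# `oracle(d)` is assumed to return true when the degree-d SOS program
# finds a certificate.
function certify_empty(oracle, dmax)
    for d in 2:2:dmax
        oracle(d) && return d   # certificate found at this degree
    end
    return nothing              # inconclusive up to degree dmax
end
```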
An issue with this is that if the coefficients of p are JuMP decision variables and a is also a JuMP decision variable, then p * (x - a) is bilinear in the decision variables. Since you have SDP constraints (to constrain the s_i to be SOS) as well as bilinear constraints, the problem becomes tricky to solve. You would need to combine an SDP solver with a meta-solver such as Alpine.jl (https://github.com/lanl-ansi/Alpine.jl) to handle the bilinearity.
However, if you fix a, e.g. you are computing g(a) and use a separate method that finds the root of g given an oracle for g(a), then you don't have any bilinearity issues.
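That outer root search can be a plain bisection on a boolean oracle, assuming the oracle's answer flips only once on the interval (a sketch with hypothetical names, not part of the original post):

```julia
# Bisection on a boolean oracle: locate the point in [lo, hi] where
# `oracle` flips from true to false, assuming a single flip on the interval.
function bisect_flip(oracle, lo, hi; tol = 1e-6)
    while hi - lo > tol
        mid = (lo + hi) / 2
        if oracle(mid)
            lo = mid   # flip is to the right of mid
        else
            hi = mid   # flip is at or to the left of mid
        end
    end
    return (lo + hi) / 2
end
```

With an oracle a -> g(a, d) for a fixed degree d, this would locate the transition value of x without any bilinear constraints.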

Here is how it would look. First, define your polynomials:

using DynamicPolynomials
@polyvar x y
f1 = -(1 - x)*y + 1/2*(1 - x - x^2 - x^3 - 2*(1 - x)*y)
f2 = (1 - 2*x)*(1 + x + x^2) - 2*(-1 + x)^2*(1 + x)*y

You want s * p to have degree at most d, hence s must have degree at most d - deg(p). Since s is SOS, it has even degree. Taking all this into account, you get something like this to create s:

using JuMP, SumOfSquares

function s(model, p, d)
    # half of the degree budget left for s, rounded up;
    # the SOS variable built below then has degree up to 2*deg
    deg = div(d - maxdegree(p, y) + 1, 2)
    monos = monomials([y], 0:deg)
    # SOS polynomial variable in the monomials of y up to degree `deg`
    return @variable(model, variable_type=SOSPoly(monos))
end

Now you can combine all this to create your oracle for g.

using CSDP  # the SDP solver used by default below

function g(a, d, solver = CSDP.Optimizer)
    # substitute x = a in both polynomials
    f_1 = subs(f1, x => a)
    f_2 = subs(f2, x => a)
    model = Model(solver)
    set_silent(model)
    # combination of the generators with SOS multipliers, total degree d
    p = sum(s(model, f, d) * f for f in [f_1, f_2, f_1 * f_2])
    @constraint(model, p == -1)
    optimize!(model)
    status = termination_status(model)
    if status == MOI.OPTIMAL
        return false  # a certificate was found: the set is empty
    elseif status == MOI.INFEASIBLE
        return true   # no certificate at this degree
    else
        error("Termination status: $status.")
    end
end

To test it, suppose you want to evaluate g on the grid x_values = range(0, stop=1, length=10).
With degree d = 2, g.(x_values, 2) gives true for all values except x = 1.
Climbing up the hierarchy with d = 6, g.(x_values, 6) gives the same result except that you also get false for an additional value before x = 1.
With d = 10, you get false for the last three values of x.
Accepting the status MOI.ALMOST_OPTIMAL as false as well, with d = 20 you get false for the last 4 values of x.

EDIT: I just realized that in the implementation, I forgot that we need to take -f1 and -f2 instead of f1 and f2, as mentioned in the beginning.
