SumOfSquares, JuMP: implementing a semidefinite constraint with a specified domain

Hello all,

I am using the SumOfSquares package to obtain Lyapunov functions with some particular properties. Given a polynomial vector field defined by \dot{x} = f(x), I wish to obtain Lyapunov functions V(x) such that, if f(x) is decomposed as,

f(x) = -\nabla V + f_V(x),

the decomposition defined by V(x) satisfies,

f_V \cdot \nabla V \leq 0.

Substituting f_V = f + \nabla V, this constraint on V can be rewritten as,

\nabla V \cdot f + \nabla V \cdot \nabla V \le 0,

which is quadratic, and hence nonlinear, in the coefficients of V. As I understand it, such a nonlinear constraint cannot be implemented directly. However, it can be rewritten as the linear matrix inequality,

M_V = \begin{bmatrix} -\nabla V \cdot f & \nabla V \\ \nabla V^\top & I_n \end{bmatrix} \succeq 0.
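
To see why this works, take the Schur complement with respect to the I_n block: since I_n \succ 0,

M_V \succeq 0 \iff -\nabla V \cdot f - \nabla V \cdot \nabla V \geq 0,

which is exactly the constraint above. The entries of M_V are linear in the coefficients of V, so this is a genuine linear matrix inequality.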

The code at the bottom provides a minimal working example and is able to implement this requirement, obtaining the Lyapunov function in some cases. However, I now need to apply the same method restricted to certain domains (e.g. x > 0). While I can specify a domain using @polyconstraint, such a domain cannot be included in an @SDconstraint. My questions are therefore:

a. Is it possible to implement a nonlinear @polyconstraint in a sum-of-squares problem, and thereby include a specified domain?
b. Is it possible to implement an @SDconstraint that includes a specified domain? This does not seem to be currently supported in JuMP, but is it mathematically feasible?

Any ideas would be much appreciated, many thanks in advance.
Rowan

using SumOfSquares, JuMP, PolyJuMP, DynamicPolynomials, MultivariatePolynomials, CSDP

@polyvar X[1:2]
F(x::Vector) = [-x[1] + 2.0x[2]^2; -x[1]*x[2] - 2.0x[2]];
f = F(X);

function Lyapunov(f, x)
    
    n = length(x);

    m = SOSModel(solver=CSDPSolver());
    @variable m ϵ
    @polyvariable m V monomials(x,2);

    # Positive definiteness constraint
    @polyconstraint m V ≥ ϵ*sum(x.^2);
    @constraint m ϵ ≥ 0

    # Apply the matrix constraint encoding fᵥ ⋅ ∇V ≤ 0 via the Schur complement.
    I = eye(x);
    ∇V = differentiate(V, x);
    # First column of Mv is [-∇V⋅f; ∇V]; column ii+1 is [∇V[ii]; I[:,ii]].
    Mv = [-dot(∇V, f); ∇V];
    for ii = 1:n; Mv = hcat(Mv, [∇V[ii]; I[:,ii]]); end
    @SDconstraint m Mv ⪰ 0 # Mv positive semidefinite

    status = solve(m);
    @show(status)

    return getvalue(V)
end

function eye(x)
    # Identity matrix with polynomial entries, so it can be concatenated with Mv.
    n = length(x);
    return diagm([1.0 + 0.0x[1] for i = 1:n])
end

V = Lyapunov(f, X)
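
As a quick sanity check of the returned V, the quadratic constraint can be evaluated at a test point (a sketch, assuming the subs substitution from MultivariatePolynomials; the test point is arbitrary):

# Evaluate ∇V⋅f + ∇V⋅∇V at a sample point; it should be ≤ 0
# wherever the matrix constraint was imposed.
∇V = differentiate(V, X)
p = dot(∇V, f) + dot(∇V, ∇V)
@show subs(p, X => [0.5, -0.3])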

a. You can have sum-of-squares constraints with quadratic coefficients. This is not part of v0.1.0, but it works on master thanks to this commit: https://github.com/JuliaOpt/SumOfSquares.jl/commit/5ea0312790e9539ff6e71988321564f2c5fb03de. You need a solver supporting quadratic constraints over the cones you use, though. There is currently no solver supporting both quadratic constraints and the SDP cone, but if you use DSOS (you can find an example here), you will have quadratic constraints over the LP cone, which can be handled by e.g. Ipopt. Solution b. is probably better, though.
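
For illustration, a minimal sketch of what this could look like inside the Lyapunov function above (assuming the master branch linked above, a DSOS formulation, and a solver such as Ipopt that accepts the resulting quadratic constraints; the domain x ≥ 0 is just an example):

∇V = differentiate(V, x)
# Quadratic-in-V constraint, imposed only on the domain x ≥ 0.
@polyconstraint(m, -dot(∇V, f) - dot(∇V, ∇V) ≥ 0, domain = @set x[1] ≥ 0 && x[2] ≥ 0)
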
b. Indeed, @SDconstraint does not support keyword arguments; I have just opened an issue for this, thanks for reporting it! However, @SDconstraint m x >= 0 is equivalent to @constraint m x in PSDCone(), so you can do @constraint(m, Mv in PSDCone(), domain = @set x >= 0).
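
Applied to your example, the @SDconstraint line inside Lyapunov would then become something like (a sketch; the particular domain x ≥ 0 is just illustrative):

# Require Mv ⪰ 0 only on the domain x ≥ 0.
@constraint(m, Mv in PSDCone(), domain = @set x[1] >= 0 && x[2] >= 0)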

Thanks a lot for this. Solution b implements what I needed.