# Modeling sum of k-smallest eigenvalues of a symmetric matrix in JuMP

Dear All,

I am trying to model the following convex constraint in JuMP:

\sum_{i=1}^{k} \lambda_i(X) \geq c,

where X \in \mathbf{S}^n (a symmetric matrix) is the decision variable, c is a constant, and \lambda_1(X) \leq \lambda_2(X) \leq \ldots \leq \lambda_n(X) are the eigenvalues of X sorted in increasing order. The constraint says that the sum of the k smallest eigenvalues of X is at least c.

Note that \sum_{i=1}^{k} \lambda_i(X) \geq c is a convex constraint, since the function \sum_{i=1}^{k} \lambda_i(X) is concave in X \in \mathbf{S}^n by the Rayleigh-Ritz variational characterization (see also the cvxpy link, where it is listed as a concave function).
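For completeness, the variational characterization behind the concavity (the Ky Fan form of the Rayleigh-Ritz principle) is:

```latex
\sum_{i=1}^{k} \lambda_i(X)
\;=\;
\min_{V \in \mathbb{R}^{n \times k},\; V^\top V = I_k} \operatorname{tr}\!\left(V^\top X V\right),
```

so the sum of the k smallest eigenvalues is a pointwise minimum of functions that are linear in X, hence concave.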

Is there a way to model such a constraint in JuMP?

Any tips will be much appreciated!

This is the same as minus the sum of the k largest eigenvalues of -X, for which a reformulation is given on p. 147 of "Lectures on Modern Convex Optimization" by Ben-Tal and Nemirovski.
So to bound the sum of the k largest eigenvalues of X, you would do

```julia
using LinearAlgebra, JuMP
# Assumes `model` and the symmetric matrix `X` already exist.
n = LinearAlgebra.checksquare(X) # same as size(X, 1), but also checks that X is square
@variable(model, Z[1:n, 1:n], PSD) # Z must be positive semidefinite in this reformulation
@variable(model, s)
@constraint(model, Symmetric(Z - X + s * Matrix(I, n, n)) in PSDCone())
```
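In symbols, my reading of the Ben-Tal and Nemirovski reformulation encoded above is that, with your increasing ordering of the eigenvalues, for any t \in \mathbb{R}:

```latex
\sum_{i=n-k+1}^{n} \lambda_i(X) \le t
\quad\Longleftrightarrow\quad
\exists\, s \in \mathbb{R},\; Z \succeq 0 :\;\;
k s + \operatorname{tr}(Z) \le t,
\quad
Z - X + s I \succeq 0.
```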


Then k * s + tr(Z) is constrained to be an upper bound on the sum of the k largest eigenvalues of X, and the bound is tight when k * s + tr(Z) is minimized. For your constraint, apply this with X replaced by -X and add the constraint k * s + tr(Z) <= -c: since the sum of the k smallest eigenvalues of X equals minus the sum of the k largest eigenvalues of -X, any feasible (s, Z) then certifies \sum_{i=1}^{k} \lambda_i(X) \geq c.
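Putting the pieces together for the original constraint, here is a minimal end-to-end sketch; the dimensions `n`, `k` and the bound `c` below are placeholder example values, and you would attach whichever SDP solver you use (e.g. SCS or Mosek) before optimizing:

```julia
using LinearAlgebra, JuMP

model = Model()
n, k, c = 4, 2, 0.0 # placeholder data; replace with your own

@variable(model, X[1:n, 1:n], Symmetric) # the decision variable
@variable(model, Z[1:n, 1:n], PSD)       # auxiliary matrix, must be PSD
@variable(model, s)                      # auxiliary scalar

# Reformulation applied to -X: require Z - (-X) + s*I ⪰ 0, i.e. Z + X + s*I ⪰ 0.
@constraint(model, Symmetric(Z + X + s * Matrix(I, n, n)) in PSDCone())

# k*s + tr(Z) upper-bounds the sum of the k largest eigenvalues of -X,
# so forcing it <= -c enforces sum of the k smallest eigenvalues of X >= c.
@constraint(model, k * s + tr(Z) <= -c)
```

From here you would add your objective and any other constraints on `X`, set an SDP-capable optimizer, and call `optimize!(model)`.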


This is a great solution, thanks so much @blegat !