JuMP docs on SDP replace `logdet` with `tr`—is this correct?

The JuMP documentation gives, as an example of a semidefinite program, the problem of finding the minimum-volume ellipse that contains a given list of ellipses centered at the origin:

minimize    trace(WX)
subject to  X >= A_i,    i = 1,...,m
            X PSD

where the weight matrix W is taken to be simply the identity matrix.

The reference for this problem is Boyd and Vandenberghe’s Convex Optimization, sec. 8.4.1. But if you consult this text, it is clear that the correct statement of the problem is

minimize    log(det(WX))
subject to  X >= A_i,    i = 1,...,m
            X PSD

instead, because the volume of an ellipsoid is proportional to the product of its axis lengths, which are determined by the eigenvalues of the matrix. Now, since \det A = \prod_i \lambda_i and \operatorname{tr} A = \sum_i \lambda_i, where the \lambda_i are the eigenvalues of A, the two objectives are good approximations of one another for a well-conditioned A. But I don’t think that they are quite the same.
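
As a quick sanity check of those identities, here is a throwaway numerical example (the matrix A below is arbitrary and not part of the problem):

using LinearAlgebra

A = [2.0 0.5; 0.5 1.0]           # an arbitrary symmetric positive definite matrix
λ = eigvals(A)
@assert det(A) ≈ prod(λ)         # determinant = product of eigenvalues
@assert tr(A) ≈ sum(λ)           # trace = sum of eigenvalues
@assert logdet(A) ≈ sum(log, λ)  # logdet turns that product into a sum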

Further evidence: The Matlab code for this problem given on the CVX website (which is, as I understand it, a semiofficial supplement to the Convex Optimization book)

cvx_begin sdp
    variable Asqr(n,n) symmetric
    variable btilde(n)
    variable t(m)
    maximize( det_rootn( Asqr ) )
    subject to
        t >= 0;
        for i = 1:m
            [ -(Asqr - t(i)*As{i}), -(btilde - t(i)*bs{i}), zeros(n,n);
              -(btilde - t(i)*bs{i})', -(- 1 - t(i)*cs{i}), -btilde';
               zeros(n,n), -btilde, Asqr] >= 0;
        end
cvx_end

uses the function det_rootn(), which is a monotonic transformation of \log \det and not of the trace.
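
(To spell that out: det_rootn(A) computes (\det A)^{1/n}, and since x \mapsto x^{1/n} and \log are both increasing on the positive reals, maximizing (\det A)^{1/n} and maximizing \log \det A pick out the same optimizer for A \succ 0; there is no such equivalence with the trace.)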

Anyhow, I would really like to use Julia’s LinearAlgebra.logdet() function here, but it hasn’t been implemented for JuMP variables yet. Here is an adaptation of the example_min_ellipse() function from the JuMP docs I linked above, which uses logdet instead of tr:

using JuMP
import LinearAlgebra
import SCS

function example_min_ellipse_logdet()
    # We will use three ellipses: two "simple" ones, and a random one.
    As = [
        [2.0 0.0; 0.0 1.0],
        [1.0 0.0; 0.0 3.0],
        [2.86715 1.60645; 1.60645 1.12639],
    ]
    # We change the weights to see different solutions, if they exist
    weights = [1.0 0.0; 0.0 1.0]
    model = Model(SCS.Optimizer)
    set_silent(model)
    @variable(model, X[i = 1:2, j = 1:2], PSD)
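    # Note: logdet is not (yet) implemented for matrices of JuMP expressions,
    # so the next line is what triggers the MethodError shown below.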
    @objective(model, Min, logdet(weights * X))
    @constraint(model, [As_i in As], X >= As_i, PSDCone())
    optimize!(model)
    return objective_value(model)
end

example_min_ellipse_logdet()

gives

MethodError: Cannot `convert` an object of type AffExpr to an object of type Float64

Questions:

  • Am I correct to suspect that the objective function used in the JuMP docs is wrong?
  • Is there a way to get JuMP/SCS to accept a logdet term in the objective function?

See here too:


Am I correct to suspect that the objective function used in the JuMP docs is wrong?

Sure. This example is very old, so I assume it was just added with the trace as a relaxation because we didn’t have LogDet support at the time.

Is there a way to get JuMP/SCS to accept a logdet term in the objective function?

Based on their transformation to root-det, you probably want something like:

model = Model()
@variable(model, X[1:2, 1:2], PSD)
@variable(model, t)
# Maximize t subject to t <= det(X)^(1/2), i.e. maximize the root-determinant of X
@objective(model, Max, t)
# MOI is re-exported by JuMP; RootDetConeTriangle(2) takes [t; the upper-triangular entries of X]
X_tri = [X[i, j] for i in 1:2 for j in i:2]
@constraint(model, [t; X_tri...] in MOI.RootDetConeTriangle(2))
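
For what it’s worth, here is a minimal end-to-end sketch checking that SCS accepts such a root-det constraint through JuMP’s bridges. This is my own toy example, not the ellipse problem: the identity-matrix cap is only there to keep the objective bounded, and if I’ve set it up right the optimal value should come out near 1.0.

using JuMP
import SCS

model = Model(SCS.Optimizer)
set_silent(model)
@variable(model, X[1:2, 1:2], PSD)
@variable(model, t)
# Maximize the root-determinant of X: t <= det(X)^(1/2)
@objective(model, Max, t)
X_tri = [X[i, j] for i in 1:2 for j in i:2]
@constraint(model, [t; X_tri] in MOI.RootDetConeTriangle(2))
# Cap X at the identity so this toy problem has a finite optimum;
# the maximizer should then be X ≈ I with objective value ≈ 1.
@constraint(model, [1.0 0.0; 0.0 1.0] - X in PSDCone())
optimize!(model)
objective_value(model)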

If you get something working, you can edit this file https://github.com/jump-dev/JuMP.jl/blob/master/docs/src/tutorials/conic/min_ellipse.jl to make a pull request updating JuMP’s documentation.


I got it working and submitted a PR:
https://github.com/jump-dev/JuMP.jl/pull/2945


Great! I’ll take a look.