# Cholesky factorization in JuMP

I want to solve an SDP optimization problem using a nonlinear solver. Specifically, I want to use a Cholesky factorization to enforce positive definiteness of a decision variable `A`. That is, I want to find a lower triangular matrix `L` such that `A = L * L'` and `diag(L) > 0`. How do I write these variables in JuMP?

What I have so far is the following:

```julia
using JuMP, LinearAlgebra
model = Model()
n = 5
@variable(model, A[1:n, 1:n])
@variable(model, L[1:n, 1:n])  # How to force this to be lower triangular?
@constraint(model, const1, A == L * L')
@constraint(model, const2, diag(L) .>= 0)  # really want > 0, but strict inequalities are not supported
```
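For reference, here is a plain LinearAlgebra sketch (no JuMP, illustrative values of my own) of the property I'm after: any lower triangular `L` with strictly positive diagonal gives a positive definite `A = L * L'`, and `cholesky` recovers `L`:

```julia
using LinearAlgebra

# A lower triangular matrix with strictly positive diagonal entries.
L = LowerTriangular([2.0 0.0 0.0; 1.0 3.0 0.0; 0.5 1.5 4.0])

# A = L * L' is positive definite by construction; this is exactly the
# Cholesky parameterization I want the decision variables to express.
A = L * L'
println(isposdef(A))          # true
println(cholesky(A).L ≈ L)    # true: the Cholesky factor recovers L
```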

Using the example problem from: The correlation problem · JuMP

```julia
using JuMP
import Ipopt
import LinearAlgebra

function example_corr_sdp()
    model = Model(Ipopt.Optimizer)
    set_silent(model)
    # Instead of
    #   @variable(model, X[1:3, 1:3], PSD)
    # do:
    @variable(model, L[1:3, 1:3], Symmetric)
    @constraint(model, [i = 1:3], L[i, i] >= 0)
    l = LinearAlgebra.LowerTriangular(L)
    @expression(model, X, l * l')
    # Diagonal is 1s
    @constraint(model, X[1, 1] == 1)
    @constraint(model, X[2, 2] == 1)
    @constraint(model, X[3, 3] == 1)
    # Bounds on the known correlations
    @constraint(model, X[1, 2] >= -0.2)
    @constraint(model, X[1, 2] <= -0.1)
    @constraint(model, X[2, 3] >= 0.4)
    @constraint(model, X[2, 3] <= 0.5)
    # Find upper bound
    @objective(model, Max, X[1, 3])
    optimize!(model)
    println("An upper bound for X[1, 3] is $(value(X[1, 3]))")
    # Find lower bound
    @objective(model, Min, X[1, 3])
    optimize!(model)
    println("A lower bound for X[1, 3] is $(value(X[1, 3]))")
    return
end

example_corr_sdp()
```
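The key trick is `LinearAlgebra.LowerTriangular`, which views only the lower triangle of the matrix, so the product `l * l'` never touches the upper entries. A plain-Julia sketch (no JuMP, made-up values) of that behavior:

```julia
using LinearAlgebra

# The 9.0 entries in the upper triangle are deliberately "garbage":
# LowerTriangular views only the lower triangle, so they are ignored.
M = [1.0 9.0 9.0;
     2.0 3.0 9.0;
     4.0 5.0 6.0]
l = LowerTriangular(M)
X = l * l'
# X depends only on M's lower triangle and is symmetric PSD by construction.
println(issymmetric(X))   # true
println(isposdef(X))      # true
```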

Why do you want to use a nonlinear solver, though? There might be numerical issues with the quadratic equality constraint (which is non-convex).
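For comparison, the convex formulation stays in the PSD cone and avoids the non-convex quadratic equality entirely. A sketch, assuming an SDP-capable solver such as SCS is installed:

```julia
using JuMP
import SCS

# Convex SDP version of the correlation problem: X is constrained to the
# PSD cone directly, so no Cholesky factor variables are needed.
model = Model(SCS.Optimizer)
set_silent(model)
@variable(model, X[1:3, 1:3], PSD)
@constraint(model, [i = 1:3], X[i, i] == 1)
@constraint(model, -0.2 <= X[1, 2] <= -0.1)
@constraint(model, 0.4 <= X[2, 3] <= 0.5)
@objective(model, Max, X[1, 3])
optimize!(model)
println("An upper bound for X[1, 3] is $(value(X[1, 3]))")
```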

@odow Is there any particular reason L needs to be declared `Symmetric` (unless the problem statement requires it)?