Implementation of graphical lasso with Cholesky factor

Hello,

I am working on a pure-Julia implementation of the graphical lasso (glasso) (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3019769/pdf/kxm045.pdf). The algorithm needs a square root of the covariance block W11, and I am thinking of using its Cholesky factor, since that speeds up the lasso optimization. However, I get an error that W11 is not positive definite. How would you address this?

using SCS
using Convex
using Distributions
using LinearAlgebra
using Statistics 

function glasso(X, ρ; tol = 1e-2, Niter = 100, withθ = true)
    Nx, Ne = size(X)
    
    S = cov(X')
    
    W = S + ρ*I
    Wold = copy(W)
    W11 = zeros(Nx-1, Nx-1)
    b = zeros(Nx-1)
    B = zeros(Nx-1, Nx-1)
    w12 = zeros(Nx-1)
    
    
    
    # Iterate until convergence
    for i=1:Niter
        for j=1:Nx
            jminus = setdiff(1:Nx, j)
            
            # Compute a Cholesky factorization of W11
            W11 .= W[jminus, jminus]
            W11sqrt = cholesky(Symmetric(W11)).L # lower-triangular factor, W11 = L*L'
            # W11svd = svd(W11)
            # W11sqrt = W11svd.U * Diagonal(sqrt.(W11svd.S)) * W11svd.Vt #W_11^(1/2)
            
            b .= 0.5*(W11sqrt \ S[jminus,j]) # W_11^(-1/2) * s_12
            
            # Solve the lasso problem
            β_lasso = Variable(Nx-1)
            
            lasso_problem = minimize(sumsquares(W11sqrt*β_lasso - b) + ρ*norm(β_lasso, 1))
            
            
            # Solve the problem by calling solve!
            solve!(lasso_problem, SCS.Optimizer; silent_solver = true)
            
            # TODO check resolution status
            # @assert lasso_problem.status 
            
            w12 .= 2.0*(W11*evaluate(β_lasso))
            
            # Update W
            W[j, jminus] .= copy(w12)
            W[jminus, j] .= copy(w12)
        end
        
        # Stop criterion
        if norm(W - Wold,1) < tol
            break 
        end
        Wold = copy(W)
    end
    
    return W
#     if withθ == true
#         # Build precision matrix
#         θ = zeros(Nx, Nx)
#     end
end
Nx = 40
μX = zeros(Nx)
ΣX = PDiagMat(ones(Nx))
πX = Distributions.MvNormal(μX, ΣX)

Ne = 100
X = rand(πX, Ne)

W = glasso(X, 0.01)

This is the same error as in this related thread: https://discourse.julialang.org/t/glm-jl-logisticregression-errors-matrix-is-not-positive-definite-cholesky-factorization-failed

In the context of Gaussian processes, the usual fix is to add a small perturbation (jitter) to the diagonal, in order to move the eigenvalues away from zero:

cholesky(Symmetric(A) + 1e-8 * I)
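
Applied to your glasso loop, that would look something like this (just a sketch; the 1e-8 jitter is an arbitrary value you may need to tune for your data):

W11 .= W[jminus, jminus]
# add a small jitter to the diagonal so the factorization succeeds even when
# W11 is only numerically positive semi-definite (1e-8 is an assumed value)
W11sqrt = cholesky(Symmetric(W11) + 1e-8 * I).L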

Another option is to use

cholesky(A; check = false)

but this might lead to other issues later on (I have never tried that myself).
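
If you go the check = false route, you can at least detect the failure yourself with issuccess from LinearAlgebra and fall back to the jittered factorization (again only a sketch):

using LinearAlgebra

C = cholesky(Symmetric(W11); check = false)
if !issuccess(C)
    # factorization failed: retry with a small diagonal perturbation
    C = cholesky(Symmetric(W11) + 1e-8 * I)
end
W11sqrt = C.L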