I need to solve large but very sparse SDPs with `COSMO.jl`. The following code runs out of memory while constructing the linear matrix inequality (i.e., the PSD constraint):

```julia
using JuMP, COSMO
using LinearAlgebra
using SparseArrays
n = 8000
b = ones(n)
p = 0.01
C = sprandn(Float64, n, n, p)
model = JuMP.Model(optimizer_with_attributes(COSMO.Optimizer,
"merge_strategy" => COSMO.CliqueGraphMerge,
"decompose" => true,
"complete_dual" => true
))
@variable(model, y[1:n])
# Try two things: 1. Symmetric() directly in the constraint:
@constraint(model, Symmetric(C - spdiagm(y)) >= 0, PSDCone())
# 2. Defining the variable explicitly and then constraining it
# @variable(model, Z[1:n, 1:n], PSD)
# @constraint(model, Z == Symmetric(C - spdiagm(y)))
@objective(model, Max, b'y)
JuMP.optimize!(model)
```

I have tried constructing the constraint with the commented-out section as well, and both times the process is killed because it runs out of memory (I'm on a machine with 32 GB of RAM running Ubuntu 22.04).

Interestingly, when I decrease `n` to 4,000, the memory usage spikes during construction of the PSD constraint but drops once COSMO is actually solving the problem.

Does anyone have any experience setting up problems of this size? Note that the random constraint matrix `C` is very sparse (1% non-zero entries). Is JuMP suitable for this type of problem?
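One detail worth flagging in my own setup (a sketch, not part of the model above): `sprandn` does not return a symmetric matrix, and wrapping the expression in `Symmetric(...)` only reads the upper triangle, discarding the lower-triangle entries of `C`. If both triangles matter, one could symmetrize `C` explicitly before building the constraint:

```julia
using SparseArrays, LinearAlgebra

n = 100  # small n just to illustrate
C = sprandn(Float64, n, n, 0.01)

# Symmetric(C) would silently use only the upper triangle of C.
# Symmetrizing explicitly keeps the information from both triangles:
Csym = (C + C') / 2

@assert issymmetric(Csym)
```

Note that `(C + C') / 2` stays sparse (the result has at most twice the non-zeros of `C`), so this step itself should not be a memory problem.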