For context:

I have an SDP problem that is formulated in the dual form (I don't think it's possible to formulate it in the primal form). For the solver to be able to optimize it, I use Dualization.jl and apply it to the solver: `model = Model(Dualization.dual_optimizer(Mosek.Optimizer))`.

The optimization problem is solved with an algorithm that iteratively updates the model's objective function and re-solves it:

```julia
model = build_model(input_data)  # Build variables and constraints
while no_convergence
    @objective(model, Min, ....)  # Update objective function
    optimize!(model)
    # Do some computations with results and check convergence
end
```

The algorithm works fine and returns the correct results, but profiling shows that the "dualization" of the model accounts for a significant share of the computation time, roughly summarized in this flame graph. Each iteration takes about ~0.35 s.

Is there a method to pre-dualize the model once and then add the "primal cost function" to the dual objective function at each iteration? I tried `dualize(model)` instead of setting the `dual_optimizer`, but the following error is thrown: `ERROR: Constraints of funtion MathOptInterface.ScalarAffineFunction{Float64} in the Set MathOptInterface.Interval{Float64} are not implemented`

(I'm not sure which constraints it is referring to.)
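For reference, the failing attempt looked roughly like this (a sketch, not runnable as-is: `build_model` and `input_data` are my own code from above, and the setup assumes MosekTools is installed and licensed):

```julia
using JuMP, Dualization, MosekTools

model = build_model(input_data)      # same primal-form model as in the loop above
dual_model = dualize(model)          # explicit one-time dualization -- this is where the error is thrown
set_optimizer(dual_model, Mosek.Optimizer)
```

The hope was to call `dualize` once up front and then only touch the objective of `dual_model` inside the loop, instead of letting `dual_optimizer` re-dualize the whole model on every `optimize!` call.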