JuMP dual model formulation and impact on performance

For context:

I have an SDP problem that is formulated in the dual form (I don’t think it’s possible to formulate it in the primal form). For the solver to be able to optimize it, I use Dualization.jl and apply it to the solver: model = Model(Dualization.dual_optimizer(Mosek.Optimizer)).

The optimization problem is solved with an algorithm that iteratively updates the model’s objective function and re-solves it.

model = build_model(input_data)  # Build variables and constraints

while no_convergence
    @objective(model, Min, ...)  # Update objective function
    optimize!(model)
    # Do some computations with the results and check convergence
end

The algorithm works fine and returns the correct results, but when profiling it I found that the “dualization” of the model accounts for a significant share of the computation time, roughly summarized in this flame graph. Each iteration takes about ~0.35 s.

Is there a method to pre-dualize the model and then add the “primal cost function” to the dual objective function? I tried dualize(model) instead of setting the dual_optimizer, but the following error is thrown: ERROR: Constraints of function MathOptInterface.ScalarAffineFunction{Float64} in the Set MathOptInterface.Interval{Float64} are not implemented
(I’m not sure which constraints it is referring to)
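To track down which constraints the error refers to: JuMP stores range constraints of the form lb <= expr <= ub as ScalarAffineFunction-in-Interval, so listing the model’s constraint types shows whether (and where) such constraints exist. A minimal sketch with a made-up variable and bounds:

```julia
using JuMP
using MathOptInterface
const MOI = MathOptInterface

# A toy model with a range constraint, which JuMP stores as
# ScalarAffineFunction-in-Interval -- the constraint type named in the error.
model = Model()
@variable(model, x)
@constraint(model, 1 <= 2x <= 3)

# List every (function, set) pair present in the model...
println(list_of_constraint_types(model))

# ...and fetch the affine-in-Interval constraints specifically.
cons = all_constraints(model, AffExpr, MOI.Interval{Float64})
println(cons)
```

Splitting each range constraint into two one-sided inequalities is one way to avoid the Interval set if dualize turns out not to support it.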


Currently, Dualization.jl does not yet support changing the objective function, so the dualized model is dropped and rebuilt from scratch at each call to optimize!.
This happens silently because you use AUTOMATIC mode. You can try MANUAL mode so that you get an error instead of the model being silently dropped; see http://www.juliaopt.org/JuMP.jl/v0.21.1/solvers/#Automatic-and-Manual-modes-1.
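For reference, a sketch of opting into MANUAL mode with the v0.21-era caching_mode keyword (this requires MosekTools and a Mosek license, so it is illustrative only; the keyword is not part of later JuMP releases):

```julia
using JuMP, Dualization, MosekTools
using MathOptInterface
const MOIU = MathOptInterface.Utilities

# In MANUAL mode, an unsupported modification (such as changing the
# objective of the dualized model) throws an error instead of silently
# emptying the cached optimizer and re-copying the whole model.
model = Model(
    Dualization.dual_optimizer(Mosek.Optimizer);
    caching_mode = MOIU.MANUAL,
)
```

The error makes the hidden re-dualization visible, which is useful when profiling suggests the model is being rebuilt every iteration.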
Could you open an issue on Dualization.jl asking for support for modifying the objective function of a DualOptimizer?