Dear JuMP Community,
I am solving a Huber regression problem using a second-order cone reformulation taken from the MOSEK tutorials.
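Written out to match my code below (this should be the same as the formulation in the MOSEK modelling cookbook):

$$
\begin{aligned}
\min_{\beta_0,\,\beta,\,u,\,v,\,t}\quad & t + 2\sigma \sum_{i=1}^{N} v_i \\
\text{s.t.}\quad & \lvert \beta_0 + x_i^\top \beta - y_i \rvert \le u_i + v_i, && i = 1,\dots,N,\\
& 0 \le u_i \le \sigma,\quad v_i \ge 0, && i = 1,\dots,N,\\
& \left(\tfrac{1+t}{2},\ \tfrac{1-t}{2},\ u\right) \in \mathcal{Q}^{N+2} && \text{(i.e. } t \ge \lVert u \rVert_2^2\text{)}.
\end{aligned}
$$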
My implementation is pretty straightforward:
using JuMP, MosekTools

function huber_regression(X, y; sigma = 1.0)
    N, n = size(X)
    model = Model()                                 # start the model
    set_optimizer(model, MosekTools.Optimizer)      # call MOSEK
    @variables(model, begin
        u[1:N] >= 0.0
        v[1:N] >= 0.0
        beta_0
        beta[1:n]
        t                                           # aux for conic constraint
    end)
    # define constraints
    @constraints(model, begin
        -(u + v) .<= beta_0 .+ X*beta .- y          # lower bound on residuals
        (u + v) .>= beta_0 .+ X*beta .- y           # upper bound on residuals
        u .<= sigma                                 # u carries the quadratic part
        [(1 + t)/2; (1 - t)/2; u] in SecondOrderCone()  # t >= ||u||^2
    end)
    @objective(model, Min, t + 2*sigma*sum(v))
    optimize!(model)
    status, solvetime, obj = termination_status(model), solve_time(model), objective_value(model)
    return status, obj, solvetime, value(model[:beta_0]), value.(model[:beta])
end
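For reference, this is roughly how I call it (illustrative only; the data here is random, not my actual dataset):

# Illustrative call on random data
X = randn(200, 5)
y = X * randn(5) .+ 0.5 .* randn(200)
status, obj, solvetime, b0, b = huber_regression(X, y; sigma = 1.0)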
However, I often get the status "SLOW_PROGRESS". The data is not ill-conditioned, and the status keeps appearing with both simulated and real-life data. Is there a problem with my implementation? It returns the correct objective value (I verified this), but perhaps there is something I can do to improve efficiency.
Thank you for your time!