# Convex.jl constraints not enforced

Hi,
I am using Convex.jl to solve a simple least-squares optimization problem with some monotonicity constraints on my variable ϕ.
The least-squares objective is minimized to zero, which should not be the case, since my constraints impose some regularization on ϕ. When I manually check whether the constraints hold in the solution, I find that they are indeed violated.
Any idea why this happens?

```julia
using Convex, SCS

# Build a toy classification dataset:
n = 10
x = rand(n, 3)  # features
y = zeros(n, 3)
y[argmax(x + rand(n, 3), dims=2)] .= 1  # labels

# Fit a function ϕ to the data:
ϕ = Variable(n, 3)

# Monotonicity constraints on ϕ: dot(ϕ(u) - ϕ(v), u - v) ≥ 0
for i in 1:n, j in 1:n
    add_constraint!(ϕ, dot(ϕ[i,:] - ϕ[j,:], x[i,:] - x[j,:]) >= 0)
end

# Least squares:
problem = minimize(sumsquares(ϕ - y))
solve!(problem, SCS.Optimizer)

# Check that the constraints are enforced:
ϕ_ = ϕ.value
minimum([dot(ϕ_[i,:] - ϕ_[j,:], x[i,:] - x[j,:]) for i in 1:n, j in 1:n])  # takes a large negative value (≈ -0.5)
```


Thanks a lot for your help!

This definitely seems like a bug. Strangely, when I call `evaluate` on the whole expression, the constraints don't appear to be violated:

```julia
julia> minimum([dot(ϕ_[i,:] - ϕ_[j,:], x[i,:] - x[j,:]) for i in 1:n, j in 1:n])
-0.8348822102765905

julia> [evaluate(dot(ϕ[i,:] - ϕ[j,:], x[i,:] - x[j,:])) for i in 1:n, j in 1:n] |> minimum
-8.560665926549506e-12
```



So that makes me think the `dot` constraint isn't being represented correctly, but the incorrect constraint that is actually sent to the solver is indeed being respected.

Looking at the code, the definition of `dot` looks wrong in the complex case. I dug in a bit more, and it seems like the `dot` is broadcasting incorrectly. Using `i = 3`, `j = 1`:

```julia
julia> broadcast(*, ϕ[i,:] - ϕ[j,:], x[i,:] - x[j,:]) |> evaluate
3×3 Matrix{Float64}:
 0.120121  -0.120121  -2.40194e-18
 0.165757  -0.165757  -3.31445e-18
 0.11009   -0.11009   -2.20135e-18
```


So the constraint is summing that matrix instead of this vector:

```julia
julia> broadcast(*, vec(ϕ[i,:] - ϕ[j,:]), x[i,:] - x[j,:]) |> evaluate
3×1 Matrix{Float64}:
  0.12012139413805409
 -0.1657565515857819
 -2.201350210535898e-18
```
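The same shape mismatch can be reproduced in plain Julia with ordinary arrays, no Convex.jl needed (a minimal sketch; the numbers here are made up and unrelated to the solve above):

```julia
r = [1.0 2.0 3.0]    # 1×3 row matrix, like Convex's ϕ[i,:]
c = [4.0, 5.0, 6.0]  # 3-element column vector, like x[i,:]

r .* c               # broadcasts out to a full 3×3 matrix
sum(r .* c)          # 90.0 — what the buggy constraint ends up summing

vec(r) .* c          # elementwise product of two 3-vectors
sum(vec(r) .* c)     # 32.0 — the intended dot product
```

Summing the broadcasted 3×3 matrix gives `(1+2+3)*(4+5+6) = 90`, a completely different quantity from the dot product `1*4 + 2*5 + 3*6 = 32`.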


This seems to stem from the fact that Convex is treating `ϕ[i,:]` as a row vector instead of a column vector:

```julia
julia> evaluate(ϕ[i,:])
1×3 Matrix{Float64}:
 1.0  2.75373e-11  2.75373e-11

julia> x[i,:]
3-element Vector{Float64}:
 0.8494305019037138
 0.3639400806886586
 0.2650583316431161
```
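For contrast, Base Julia drops the sliced dimension, so `A[i, :]` on an ordinary matrix is a plain column vector (a minimal sketch with a throwaway matrix):

```julia
A = [1.0 2.0 3.0;
     4.0 5.0 6.0]

A[1, :]        # 3-element Vector{Float64}: [1.0, 2.0, 3.0]
size(A[1, :])  # (3,) — the row dimension is dropped
```

This is exactly the discrepancy above: `x[i,:]` follows Base semantics and is a 3-vector, while the Convex expression `ϕ[i,:]` stays a 1×3 row, so broadcasting the two together produces a matrix.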


Thanks a lot for looking into it. I changed the constraint to:

```julia
add_constraint!(ϕ, dot(vec(ϕ[i,:] - ϕ[j,:]), x[i,:] - x[j,:]) >= 0)
```

and it seems to work.
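For completeness, here is the full script with the `vec` workaround applied in the constraint loop, followed by the same manual check as before (a sketch, assuming the rest of the original script is unchanged):

```julia
using Convex, SCS

n = 10
x = rand(n, 3)
y = zeros(n, 3)
y[argmax(x + rand(n, 3), dims=2)] .= 1

ϕ = Variable(n, 3)

# vec(...) forces the sliced Convex expression into a column vector,
# so dot no longer broadcasts a row against a column into a matrix.
for i in 1:n, j in 1:n
    add_constraint!(ϕ, dot(vec(ϕ[i,:] - ϕ[j,:]), x[i,:] - x[j,:]) >= 0)
end

problem = minimize(sumsquares(ϕ - y))
solve!(problem, SCS.Optimizer)

ϕ_ = ϕ.value
minimum([dot(ϕ_[i,:] - ϕ_[j,:], x[i,:] - x[j,:]) for i in 1:n, j in 1:n])
# should now be negative only up to solver tolerance
```

With the fix, the manual check should report violations only on the order of SCS's numerical tolerance, not the ≈ -0.5 seen originally.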
