Hi,

I am using Convex.jl to solve a simple least-squares problem with monotonicity constraints on a variable ϕ. The least-squares objective is driven all the way to zero, which should not be possible, since the constraints impose some regularization on ϕ. And indeed, when I manually check whether the constraints hold at the solution, they turn out to be violated.

Any idea why this happens?
using Convex, LinearAlgebra, SCS # LinearAlgebra provides dot
# Building a toy classification dataset:
n = 10
x = rand(n,3) # features
y = zeros(n, 3)
y[argmax(x + rand(n, 3), dims=2)] .= 1 # one-hot labels: 1 at the (noisy) argmax feature of each row
# Fitting a function ϕ to the data:
ϕ = Variable(n, 3)
# Monotonicity constraints on ϕ: ⟨ϕ(u) - ϕ(v), u - v⟩ ≥ 0 for all pairs of data points
for i in 1:n, j in 1:n
    add_constraint!(ϕ, dot(ϕ[i, :] - ϕ[j, :], x[i, :] - x[j, :]) >= 0)
end
# Least squares:
problem = minimize(sumsquares(ϕ - y))
solve!(problem, SCS.Optimizer)
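println(problem.status, ", optimal value = ", problem.optval) # in my runs: solved, with an objective of essentially zero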
# Checking whether the constraints hold at the solution:
ϕ_ = ϕ.value
minimum([dot(ϕ_[i, :] - ϕ_[j, :], x[i, :] - x[j, :]) for i in 1:n, j in 1:n]) # returns a large negative value (about -0.5), so the constraints are violated
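For reference, below is the formulation I would have expected to be equivalent, with the constraints collected in a list and passed to minimize instead of attached to the variable via add_constraint! (ψ is just a fresh variable name I introduce here, so the constraints already attached to ϕ above don't carry over):

ψ = Variable(n, 3) # fresh variable, no attached constraints
constraints = [dot(ψ[i, :] - ψ[j, :], x[i, :] - x[j, :]) >= 0 for i in 1:n for j in 1:n]
problem2 = minimize(sumsquares(ψ - y), constraints)
solve!(problem2, SCS.Optimizer)
minimum([dot(ψ.value[i, :] - ψ.value[j, :], x[i, :] - x[j, :]) for i in 1:n, j in 1:n]) # same check as above

Is add_constraint! on a variable supposed to behave the same way as passing the constraints to minimize?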
Thanks a lot for your help!