Help rewriting Convex.jl problem in JuMP.jl

question
jump

#1

I am having a hard time translating a convex optimization problem from Convex.jl to JuMP.jl:

using Convex, ECOS

# generate some data
n = 100
e = [sin(50i)*sin(50j) for i=1:n, j=1:n]

# optimization variable
h = Variable(n, n)

# objective: pos(x) = max(x, 0), so this is the sum of absolute deviations |h - e|
C = sum(pos(h[i,j] - e[i,j]) + pos(e[i,j] - h[i,j]) for i=1:n, j=1:n)

prob = minimize(C)
solve!(prob, ECOSSolver())

My attempt as a first-time user was as follows:

using JuMP, ECOS

# generate some data
n = 100
e = [sin(50i)*sin(50j) for i=1:n, j=1:n]

# define problem
m = Model(solver = ECOSSolver())

# optimization variable
@variable(m, h[1:n,1:n])

# objective: attempt to apply Base's max directly to JuMP expressions
C = sum(max(0, h[i,j] - e[i,j]) + max(0, e[i,j] - h[i,j]) for i=1:n, j=1:n)

@objective(m, Min, C)
solve(m)

The code doesn’t run and gives me the following error:

MethodError: no method matching isless(::JuMP.GenericAffExpr{Float64,JuMP.Variable}, ::Int64)
Closest candidates are:
isless(::Char, ::Integer) at deprecated.jl:49
isless(::AbstractFloat, ::Real) at operators.jl:42
isless(::ForwardDiff.Dual{N,T<:Real}, ::Real) at /home/juliohm/.julia/v0.5/ForwardDiff/src/dual.jl:160

  1. Can you please explain what the issue is? Are we allowed to call Julia functions like `max` directly in JuMP.jl?

  2. I am also wondering why the Convex.jl version takes a while (more than 5 minutes) to return, even though the solver itself solves the problem in less than a second. Could someone please elaborate on what is happening in the Convex.jl approach?


#2

No, you can’t call functions like `max` on JuMP variables or expressions. You’ll have to manually apply the transformations that Convex.jl does automatically.


#3

You mean rewrite the problem in LP form? Can you confirm that the following formulation is equivalent?

minimize sum(t)
subject to (for all i, j)
t[i,j] >= 0
h[i,j] - e[i,j] <= t[i,j]
e[i,j] - h[i,j] <= t[i,j]
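
In JuMP syntax (with the same JuMP/ECOS API as my attempt above), I believe this would look something like the following; `t` is the auxiliary matrix variable, and the constraint names `upper`/`lower` are just labels I made up:

using JuMP, ECOS

# generate some data
n = 100
e = [sin(50i)*sin(50j) for i=1:n, j=1:n]

m = Model(solver = ECOSSolver())

# optimization variable and auxiliary variable
@variable(m, h[1:n,1:n])
@variable(m, t[1:n,1:n] >= 0)

# t[i,j] >= max(h[i,j] - e[i,j], e[i,j] - h[i,j]) = |h[i,j] - e[i,j]|
@constraint(m, upper[i=1:n, j=1:n], h[i,j] - e[i,j] <= t[i,j])
@constraint(m, lower[i=1:n, j=1:n], e[i,j] - h[i,j] <= t[i,j])

@objective(m, Min, sum(t))
solve(m)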

Also, I’d be happy to stick with Convex.jl if I could solve the slow parsing issue. Is there any workaround for that?


#4

Yes, the transformation works. I like Convex.jl, though; I wish I could stick with it.


#5

PRs addressing any performance issues are welcome.


#6

Well, if only I knew what exactly is causing the performance issue in the first place…


#7

Are you sure the two formulations are equivalent?

I restrict myself to a single variable and call d one of your differences h[i,j] - e[i,j]. Let us also assume, for the sake of argument, that there is an additional constraint d <= 0. If I understand you correctly, you are saying that

maximize max(0,d)
subject to
d <= 0

which has an optimal objective value of 0, is equivalent to

maximize t
subject to
t >= 0
d <= t
d <= 0

which to me seems unbounded.
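
As a quick check of my claim (a sketch using the same JuMP/ECOS API as above, with d a free variable):

using JuMP, ECOS

m = Model(solver = ECOSSolver())
@variable(m, d)
@variable(m, t >= 0)
@constraint(m, d <= t)
@constraint(m, d <= 0)
@objective(m, Max, t)

status = solve(m)   # t has no upper bound, so I expect status == :Unbounded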


#8

@mzaffalon max(0,d) is convex, so you cannot maximize it; or, more precisely, the maximum is infinity. Now, if you make the explicit constraint d <= 0 implicit in the objective, then max(0,d) indeed becomes 0 and you can maximize it.

I believe the formulation I wrote for the minimization problem is equivalent: since we minimize sum(t) and each t[i,j] is only bounded from below, the solver pushes t[i,j] down onto the largest of its lower bounds, so at the optimum t[i,j] = max(0, h[i,j] - e[i,j], e[i,j] - h[i,j]) = |h[i,j] - e[i,j]|.
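
If it helps, here is a small numerical check of the equivalence (a sketch with a reduced n; note that I qualify Convex.Variable because both packages export the name Variable):

using Convex, JuMP, ECOS

n = 5   # small instance, just for the check
e = [sin(50i)*sin(50j) for i=1:n, j=1:n]

# Convex.jl version
hc = Convex.Variable(n, n)
prob = minimize(sum(pos(hc[i,j] - e[i,j]) + pos(e[i,j] - hc[i,j]) for i=1:n, j=1:n))
solve!(prob, ECOSSolver())

# JuMP reformulation
m = Model(solver = ECOSSolver())
@variable(m, hj[1:n,1:n])
@variable(m, t[1:n,1:n] >= 0)
@constraint(m, up[i=1:n, j=1:n], hj[i,j] - e[i,j] <= t[i,j])
@constraint(m, dn[i=1:n, j=1:n], e[i,j] - hj[i,j] <= t[i,j])
@objective(m, Min, sum(t))
solve(m)

# both optima should agree (and be ~0 here, since h = e is feasible)
println(prob.optval, "  ", getobjectivevalue(m))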


#9

You are right, I misread your post. Thank you for the explanation.