# How can lazy constraints be written in a *.lp file?

I need to convert this model to a *.lp file, but the lazy constraints cannot be converted. Would you please help me? Thanks very much.

``````julia
using JuMP, CPLEX

T = Model(solver = CplexSolver())

# Sets -------------------------------------------------------------------
totalV = 5
V = 1:totalV

# Parameters -------------------------------------------------------------
d = [100  10   8   9   7;
      10 100  10   5   6;
       8  10 100   8   9;
       9   5   8 100   6;
       7   6   9   6 100]

# Variables --------------------------------------------------------------
@variable(T, x[V, V], Bin)

# Assignment constraints --------------------------------------------------
@constraint(T, c1[i in V], sum(x[i, j] for j in V) == 1)
@constraint(T, c2[j in V], sum(x[i, j] for i in V) == 1)

# Lazy subtour-elimination callback ---------------------------------------
function sbt(cb)
    v = zeros(Int, totalV)             # 1 if the node is already on a tour
    # n[i] is the node visited right after node i in the current solution
    n = [round(Int, sum(j * getvalue(x[i, j]) for j in V)) for i in V]
    sd = 0                             # number of cuts added in this call
    for i in V
        if v[i] == 0
            t = Int[]                  # the tour through node i
            ff = i
            while true
                push!(t, ff)
                v[ff] = 1
                ff = n[ff]
                ff == i && break
            end
            sze = length(t)
            if sze < totalV            # t is a subtour: cut it off
                @lazyconstraint(cb, sum(x[j, k] for j in t, k in t) <= sze - 1)
                sd += 1
            end
        end
    end
end
addlazycallback(T, sbt)   # the callback must be registered before solving

# Objective ---------------------------------------------------------------
@objective(T, Min, sum(d[i, j] * x[i, j] for i in V, j in V))

solve(T)
``````

JuMP does not support writing lazy constraints to an LP file, and there are no workarounds. Why do you need this?


Thanks very much.
I have to compare my model, which is solved with an exact bi-objective method, against a valuable branch-and-bound method whose code is available on GitHub. But that code requires the model to be supplied as a *.lp file.

You could try to write an *.lp file of the model before solving it. Then, in the callback, inside this if statement:

``````julia
if sze < totalV
    @lazyconstraint(cb, sum(x[j, k] for j in t, k in t) <= sze - 1)
    sd += 1
end
``````

open the *.lp file, append the cut to it using the `print()` function, and then close the file. This is probably not very efficient, but it should work.
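A solver-free sketch of that file edit (everything here is illustrative: the `x(i,j)` variable-name pattern is an assumption, so check the names JuMP actually wrote to your file). One detail to watch: the LP format ends with an `End` marker and constraint rows must live in the `Subject To` section, so the cut should be spliced in before `End` rather than appended blindly at the end of the file:

```julia
# Sketch: insert one subtour-elimination cut into an existing .lp file,
# just before its final "End" line, so the file still parses.
# Variable names like "x(1,3)" are an assumption, not JuMP's guaranteed output.
function splice_cut(path::AbstractString, t::Vector{Int}, cutid::Int)
    lines = readlines(path)
    terms = join(["x($j,$k)" for j in t for k in t], " + ")
    cut = " cut$(cutid): $terms <= $(length(t) - 1)"
    i = findlast(l -> strip(l) == "End", lines)
    insert!(lines, i === nothing ? length(lines) + 1 : i, cut)
    open(path, "w") do io
        foreach(l -> println(io, l), lines)
    end
end

# Tiny stand-in .lp file to demonstrate the splice:
path = tempname()
write(path, "Minimize\n obj: x(1,3)\nSubject To\n c1: x(1,3) <= 1\nEnd\n")
splice_cut(path, [1, 3], 1)
# The file now contains the row
#  cut1: x(1,1) + x(1,3) + x(3,1) + x(3,3) <= 1
# inside the Subject To section, before End.
```

Doing a full read-modify-write on every callback invocation is wasteful, but for moderate instance sizes it is the simplest way to keep the file well-formed.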

Note: I think that in older JuMP versions there was no guarantee that the cuts you tried to add were really accepted by CPLEX. If this is still the case, your *.lp file may contain cuts that you tried to add but that CPLEX never accepted.


Another possibility is to keep a copy of the original model, and to modify your callback so that, every time you submit a lazy constraint to the original model (which is being solved), you add it as a "regular" constraint to the copy.

Once the optimization is done, the copied model contains all the constraints that were added within a callback, and you can write it to an `.lp` file. This might be more efficient and less error-prone than dynamically modifying the `.lp` file within the callback.
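The same record-now-write-later bookkeeping can be sketched without JuMP at all: record an LP-format row every time the callback submits a cut, then merge them into the model file in one pass after the solve (the `x(i,j)` naming and the `record_cut!` helper are illustrative assumptions, strings standing in for the mirrored constraints):

```julia
# Each callback invocation records the cut it just submitted; after the
# solve, the recorded rows can be written into a copy of the model file.
cuts = String[]

function record_cut!(cuts::Vector{String}, t::Vector{Int})
    terms = join(["x($j,$k)" for j in t for k in t], " + ")
    push!(cuts, "$terms <= $(length(t) - 1)")
end

# As if the callback had fired twice:
record_cut!(cuts, [2, 4])       # the 2-cycle 2 -> 4 -> 2 was cut off
record_cut!(cuts, [1, 3, 5])    # the 3-cycle 1 -> 3 -> 5 -> 1 was cut off
# `cuts` now holds two LP-format rows, ready to be written after solve().
```

Because the writing happens once, after the optimization, this avoids touching the file inside the callback entirely.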

As pointed out above: CPLEX may or may not accept your lazy constraints. The only guarantee is that, if a solution is rejected, it will not be visited again. In particular, there is no guarantee that CPLEX will keep the lazy constraints in the linear relaxation.


Thanks very much for your kind help.
If I change the JuMP version, will this problem be solved?

I'm really grateful.
As you mentioned, is it not possible to use lazy constraints for subtour elimination? I used this in an inventory routing problem to solve models with more nodes, and faster; the model worked, but it has many instances with different numbers of nodes, vehicles, and periods. So, is it possible that the constraints are not added completely while an instance is being solved, giving wrong solutions?

The only guarantee is that the final solution returned by the solver will respect all lazy constraints that have been added.

This is the case in all versions of JuMP.


I'm very thankful.
How is that possible? Is the following sentence correct?
"The model with lazy constraints works at small scale; consequently, it works at large scale too."

How is that possible?

This is how most MIP solvers deal with lazy constraints.

The only guarantee is that the final solution returned by the solver will respect all lazy constraints that have been added.

The small print is even more specific: the Gurobi docs state that

Your callback should be prepared to cut off solutions that violate any of your lazy constraints, including those that have already been added. Node solutions will usually respect previously added lazy constraints, but **not always**.

(I added the emphasis on the "not always".) In fact, the only thing that Gurobi really guarantees is that it will give you the opportunity to reject a solution before it is accepted as incumbent, by providing Gurobi with a violated constraint. As far as I know, CPLEX works in a similar way.

In other words, if you submit a violated constraint, Gurobi guarantees that it will not accept the solution. However, Gurobi gives you no guarantee that the next incumbent candidate will satisfy that constraint. This is why the docs state that your callback should […] cut off solutions that violate any of your lazy constraints, including those that have already been added.
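In practice this means a defensive callback should remember the tours it has already cut off and re-check them against each new candidate before searching for new subtours. A solver-free sketch of the violation check, where the hypothetical `xval` matrix stands in for the values returned by `getvalue(x)`:

```julia
# A stored subtour cut over the nodes in `t` is violated by candidate
# values `xval` when the arcs inside `t` sum to more than |t| - 1
# (up to a small tolerance).
violated(t::Vector{Int}, xval::AbstractMatrix) =
    sum(xval[j, k] for j in t, k in t) > length(t) - 1 + 1e-6

xval = zeros(5, 5)
xval[1, 2] = 1.0; xval[2, 1] = 1.0   # candidate contains the 2-cycle 1 -> 2 -> 1
violated([1, 2], xval)               # true: re-submit this stored cut
violated([3, 4], xval)               # false: this stored cut is satisfied
```

Re-submitting an already-stored cut whenever it is violated is exactly the "be prepared to cut off solutions … including those that have already been added" behavior the Gurobi docs ask for.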

How does all this impact you? Say you record all the lazy constraints added by your callback, and save an `.lp` file that contains the original model plus all the lazy constraints. Then,

• If you load that `.lp` file afterwards and solve it (without any callback), the resulting solution may violate other constraints that were not seen in the initial run.
For instance, if you're solving a TSP by adding subtour-elimination constraints in a callback, you have no guarantee that the solution obtained by solving the `.lp` file (and just the `.lp` file) will satisfy all subtour-elimination constraints: only the ones that were generated in the initial solve and showed up in the `.lp` file.
• Loading the `.lp` file and solving it directly may be much slower than the initial solve (because of all the additional constraints that stay in the formulation), or it may be much faster (because those constraints take you straight to the optimum). There's no telling until you've run it.
• Intermediate results like the LP relaxation may be drastically different. In general, adding more constraints to the problem will strengthen the LP relaxation, but it will also affect presolve, cut generation, heuristics, etc., with unclear effects in terms of performance.
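Because of the first point, it is worth verifying the solution of the re-solved `.lp` file post hoc. A sketch: recover each node's successor from the solution values (how you do that depends on your variable names) and check that the arcs form a single tour:

```julia
# Decompose a successor array into its cycles; a subtour-free TSP solution
# yields exactly one cycle covering all nodes.
function subtours(succ::Vector{Int})
    visited = falses(length(succ))
    tours = Vector{Int}[]
    for i in eachindex(succ)
        visited[i] && continue
        t = Int[]
        j = i
        while !visited[j]
            visited[j] = true
            push!(t, j)
            j = succ[j]
        end
        push!(tours, t)
    end
    return tours
end

subtours([2, 1, 4, 5, 3])   # two cycles (1-2 and 3-4-5): subtours remain
subtours([2, 3, 4, 5, 1])   # one cycle through all five nodes: feasible
```

If more than one cycle comes back, the `.lp`-file solution violates a subtour-elimination constraint that was never generated in the initial solve.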