Hi all,

I’m doing some optimization on weighted graphs, and one of my constraints is showing up as quadratic for some reason; JuMP says it can’t convert it to an affine expression, even though the docs suggest quadratic constraints are supported. (“Because solvers can take advantage of the knowledge that a constraint is quadratic, prefer adding quadratic constraints using constraint, rather than NLconstraint.” Linked here: ) Additionally, I’m not sure why JuMP thinks the constraint is quadratic in the first place; I’m not multiplying or dividing my two variables by each other, just adding an inequality constraint between them.

The model in question is as follows, with some constraints omitted for brevity. The only one that fails is the one marked `# <-- fails`; the rest have been tested and work fine as far as I can tell.

```julia
using JuMP, HiGHS, Graphs, LinearAlgebra

model = Model(HiGHS.Optimizer)
flows = collect(Graphs.weights(graph))
@variable(model, ACR[1:nv(graph), 1:nv(graph)], Int);
@variable(model, an[1:nv(graph)], Bin);
@constraint(model, ACR .>= 0)
@constraint(model, ACR .<= Diagonal(1 .- an) * flows)   # <-- fails
@constraint(model, sum(ACR) + dot(cn, an) <= B);        # cn and B are defined elsewhere
# etc...
```

What I don’t understand is why this is quadratic. The constraint just compares an integer decision matrix (ACR) against a constant matrix (flows) whose rows are scaled by a binary decision vector embedded along the diagonal: `Diagonal(1 .- an) * flows` multiplies row `i` of `flows` by `(1 - an[i])`. Since `flows` is constant, that product is linear in `an`, so both sides should be affine. How is this quadratic, or any different from an `x .>= y` constraint? Also, is there any way to fix it?
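To illustrate why I expect the right-hand side to be affine, here is a plain-number sanity check (with `an` fixed to numeric values rather than JuMP variables, and a made-up 3×3 `flows` matrix) showing that `Diagonal(v) * M` is just row scaling:

```julia
using LinearAlgebra

# Diagonal(v) * M scales row i of M by v[i], so each entry of
# Diagonal(1 .- an) * flows is (1 - an[i]) * flows[i, j] -- linear in an.
an = [1.0, 0.0, 1.0]
flows = [1.0 2.0 3.0;
         4.0 5.0 6.0;
         7.0 8.0 9.0]

lhs = Diagonal(1 .- an) * flows
rhs = (1 .- an) .* flows   # broadcasting gives the same row scaling
@assert lhs == rhs
```

So entry-wise, each row is either kept (`an[i] == 0`) or zeroed out (`an[i] == 1`), which is exactly the linear expression I thought I was building.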

Thanks in advance for any help anyone is able to provide here; I’m completely stumped.