How do I make sure my optimization problem doesn’t end up being NP-hard?

It’s quite easy to end up with an NP-hard problem. For example:

maximize  a11 x1^2 + a12 x1 x2 + a13 x1 x3 + a22 x2^2 + a23 x2 x3 + a24 x2 x4 + …   (i.e. x' A x for an arbitrary symmetric A)
subject to  li ≤ xi ≤ ui  for every variable

is already NP-hard on its own. A proof sketch: force each xi to either -1 or +1 by giving the diagonal terms (x1^2, x2^2, …) a large coefficient; then, for each clause of a SAT instance, add terms that act as an OR gate over its literals (+xi for a positive literal, -xi for a negated one), arranged so that only one clause check is active at a time. The SAT instance is then satisfiable iff the optimal value exceeds a certain threshold, so even approximating this problem to arbitrary accuracy would solve SAT.
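To make the first step of that argument concrete, here is a minimal sketch in plain NumPy (the matrix Q, the weight M, and n = 3 are arbitrary illustrative choices, not from the post above): once the diagonal terms are large enough, the maximum of x' Q x over the box [-1, 1]^n is attained at a ±1 vertex, so solving the box-constrained problem exactly amounts to a search over 2^n sign patterns.

```python
import itertools

import numpy as np

n = 3
rng = np.random.default_rng(0)
Q = rng.normal(size=(n, n))
Q = (Q + Q.T) / 2            # symmetric, generally indefinite
M = 10.0                     # large diagonal weight: pushes every x_i to a bound at the maximum
Q = Q + M * np.eye(n)

def f(x):
    return x @ Q @ x

# Maximum over the 2^n vertices {-1, +1}^n.
best_vertex = max((np.array(v) for v in itertools.product([-1.0, 1.0], repeat=n)), key=f)

# Maximum over a fine grid covering the whole box [-1, 1]^n.
grid = np.linspace(-1.0, 1.0, 21)
best_box = max((np.array(v) for v in itertools.product(grid, repeat=n)), key=f)

print("best vertex:", best_vertex, f(best_vertex))
print("best in box:", best_box, f(best_box))
# The box maximizer lands on a ±1 vertex: the continuous problem collapses into
# a combinatorial search over sign patterns, which is what the SAT reduction exploits.
```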

The most general case that is still tractable is probably semidefinite programming, but what exactly are your constraints? An imprecise problem definition can easily lead to an NP-hard formulation.
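To show what the tractable counterpart looks like, here is a hedged sketch of the standard SDP relaxation of the ±1-constrained quadratic above, written with the cvxpy modelling library (the library choice and the particular Q are my own illustrative assumptions, not from this thread): the rank-one matrix x x' is replaced by a PSD variable X with unit diagonal, which turns the NP-hard problem into one solvable in polynomial time.

```python
import cvxpy as cp
import numpy as np

n = 3
rng = np.random.default_rng(0)
Q = rng.normal(size=(n, n))
Q = (Q + Q.T) / 2                        # an indefinite symmetric matrix, as above

X = cp.Variable((n, n), PSD=True)        # X plays the role of x x'
objective = cp.Maximize(cp.trace(Q @ X)) # trace(Q X) = x' Q x when X = x x'
constraints = [cp.diag(X) == 1]          # relaxation of the constraints x_i^2 = 1
problem = cp.Problem(objective, constraints)
problem.solve()                          # handled by a conic/SDP solver

print("SDP relaxation value:", problem.value)
```

The relaxation gives an upper bound on the original maximum; recovering a good ±1 point from X is a separate (rounding) step.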

I’m not sure what your precise question is, but one guide to avoiding NP-hard problems is to ask: can I formulate the problem as a conic program, that is, a convex program over conic constraint sets.
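For reference (standard textbook form, not something specific to this thread), a conic program is

$$
\min_{x} \; c^\top x \quad \text{subject to} \quad Ax = b,\; x \in K,
$$

where K is a closed convex cone. Taking K to be the nonnegative orthant gives linear programming, the second-order cone gives SOCP, and the cone of positive semidefinite matrices gives SDP; for these standard cones the problem can be solved to any fixed accuracy in polynomial time by interior-point methods.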
