Regarding the number of variables (continuous and/or integer) and constraints, is there any consensus on whether a BnB algorithm is a viable way to solve a MINLP?
For example, I’m solving a problem with around 7700 variables (of which 12 are integer) and 7700 constraints. Depending on the parameters I use, it takes between 2 h and 6 h to solve. However, I’ll need to test my code on a similar problem with many more variables and constraints, and even if Couenne can solve it, I’m afraid it will take days. So I’m not sure whether BnB is the best (or even an acceptable) approach here…
Hmm, that’s not very user-friendly right now. The couenne.opt file needs to be in the same directory as the .nl file that’s passed to the couenne executable. But the .nl file is generated in a different directory, namely in AmplNLWriter.solverdata_dir (CouenneNL is just a thin wrapper around AmplNLWriter). So I think it should work if you put your couenne.opt file there. Improving this situation, e.g. by letting users specify directly in which directory AmplNLWriter generates the .nl file, would be a good contribution to the AmplNLWriter package.
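For reference, a couenne.opt that disables some of the bound-tightening passes and sets a stopping gap might look like the sketch below. The option names are taken from Couenne’s and Bonmin’s option lists, but you should verify them against the documentation of the Couenne version you are running; the values are only illustrative:

```
# Disable some of the more expensive bound tightening passes
aggressive_fbbt no
optimality_bt no

# Stop once the relative optimality gap falls below 1%
allowable_fraction_gap 0.01
```

Couenne reads this file from the directory it is invoked in alongside the .nl file, which is why its location matters here.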
Yes, that worked. I just put the “couenne.opt” file in the same folder as the files I’m running: Couenne stopped using those bound tightening options and exited earlier because of the gap I set there (it even printed a message saying so).