Congrats @sshin23, that’s an impressive new option!
I agree with the idea of renaming NLPModels as ADNLPModels, especially since ExaModels also implements the NLPModels API.
I suggest also specifying which BLAS/LAPACK backend is used for the benchmarks.
Is it OpenBLAS32_jll.jl?
Ipopt_jll.jl and HSL_jll.jl (from libHSL) are compiled with LBT, and performance can change significantly if you use MKL.jl or AppleAccelerate.jl, for example.
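For reference, a minimal way to record the active backend in the benchmark logs (`BLAS.get_config()` is part of the LinearAlgebra standard library on Julia 1.7+):

```julia
# A minimal check of which BLAS/LAPACK backend LBT is forwarding to
# (OpenBLAS by default, MKL if MKL.jl was loaded first, etc.).
using LinearAlgebra
println(BLAS.get_config())
```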
I checked the Manifest.toml of rosetta-opf and an old version of Ipopt (artifact Ipopt_jll.jl) is used: the latest version is 300.1400.1400, while version 300.1400.400 is the last one supported by Julia 1.6 and was compiled in 2021.
@tmigot, I have updated NLPModels to ADNLPModels. This will be reflected in the next update to the table.
@amontoison, I forget exactly which dependency combination is holding the Ipopt_jll version down. @odow had looked into it once and may remember more details. For the linear solver, Ipopt_jll is pulling an HSL DLL that is installed separately from Julia, as per these instructions. I am interested in updating to the latest best practices for using HSL in Julia, but I probably need some help getting it set up.
Ah, it’s because of OptimizationMOI. Let me take a look; they shouldn’t need that compat.
@ccoffrin, you can change the BLAS backend in addition to HSL.
See GitHub - jump-dev/Ipopt.jl: Julia interface to the Ipopt nonlinear solver.
This requires the updated Ipopt_jll, but it should “just” be a matter of adding `using MKL` before `using Ipopt`.
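Something along these lines (a minimal sketch, assuming MKL.jl is installed and the LBT-enabled Ipopt_jll is in the environment):

```julia
# Load MKL before Ipopt so that LBT forwards BLAS/LAPACK calls to MKL;
# the LBT-enabled Ipopt_jll then uses it automatically.
using MKL
using JuMP, Ipopt

model = Model(Ipopt.Optimizer)
# ... build the OPF model and call optimize!(model) as usual
```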
@ccoffrin
For the HSL linear solvers, you now just need to download and install HSL_jll.jl (it requires an academic licence).
Please take the version precompiled with LBT.
With a `using HSL_jll`, you can directly use the HSL linear solvers in Ipopt:
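A minimal sketch of that setup, following the pattern documented in the Ipopt.jl README:

```julia
# After installing the LBT-enabled HSL_jll.jl, point Ipopt at the HSL library
# and pick one of its linear solvers (ma27, ma57, ma86, ma97, ...).
using JuMP, Ipopt
import HSL_jll

model = Model(Ipopt.Optimizer)
set_attribute(model, "hsllib", HSL_jll.libhsl_path)
set_attribute(model, "linear_solver", "ma27")
```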
Actually, we probably can’t update anyway, because of:
That only impacts the variants’ Project.toml, not the main one, right?
Now fixed in AmplNLWriter v1.2.1 and OptimizationMOI v0.4.0. I’ll make a PR.
Thanks to recent contributions from @amontoison, there is a new version of the NLPModels model implementation which is much more scalable. It solves networks of up to 30k buses within a 2-hour time limit. Thanks @amontoison!
While doing the regression test on NLPModels I updated to the latest versions of all packages. It looks like there are small performance improvements across the board.
Case | Vars | Cons | ExaModels (s) | JuMP (s) | ADNLPModels (s) | NonConvex (s) | Optimization (s) |
---|---|---|---|---|---|---|---|
case3_lmbd | 24 | 28 | 1.13e-02 | 1.54e-02 | 1.50e-02 | 7.72e+00 | 3.10e+00 |
case5_pjm | 44 | 53 | 5.27e-02 | 6.70e-02 | 6.63e-02 | 1.70e+01 | 4.22e+00 |
case14_ieee | 118 | 169 | 4.02e-02 | 5.99e-02 | 6.93e-02 | 7.59e+01 | 8.87e+00 |
case24_ieee_rts | 266 | 315 | 6.60e-02 | 1.23e-01 | 2.53e-01 | 2.41e+02 | 1.79e+01 |
case30_ieee | 236 | 348 | 4.65e-02 | 7.52e-02 | 1.42e-01 | 2.48e+02 | 2.00e+01 |
case30_as | 236 | 348 | 5.95e-02 | 8.50e-02 | 1.26e-01 | 2.47e+02 | 1.88e+01 |
case39_epri | 282 | 401 | 4.27e-02 | 1.10e-01 | 2.88e-01 | 3.28e+02 | 2.09e+01 |
case57_ieee | 448 | 675 | 6.06e-02 | 1.01e-01 | 2.13e-01 | 5.38e+02 | 4.25e+01 |
case60_c | 518 | 737 | 5.30e-02 | 1.26e-01 | 3.65e-01 | 5.37e+02 | 4.45e+01 |
case73_ieee_rts | 824 | 987 | 1.00e-01 | 2.38e-01 | 5.48e-01 | 1.27e+03 | 8.53e+01 |
case89_pegase | 1042 | 1649 | 1.19e-01 | 3.34e-01 | 1.94e+00 | 3.89e+03 | 2.28e+02 |
case118_ieee | 1088 | 1539 | 1.18e-01 | 2.98e-01 | 1.35e+00 | 3.06e+03 | 1.63e+02 |
case162_ieee_dtc | 1484 | 2313 | 1.55e-01 | 3.51e-01 | 1.78e+00 | N.D. | 3.69e+02 |
case179_goc | 1468 | 2200 | 1.80e-01 | 4.53e-01 | 2.48e+00 | 5.53e+03 | 2.88e+02 |
case197_snem | 1608 | 2397 | 1.30e-01 | 3.24e-01 | 1.75e+00 | N.D. | 4.01e+02 |
case200_activ | 1456 | 2116 | 1.02e-01 | 2.75e-01 | 1.52e+00 | 6.51e+03 | 3.05e+02 |
case240_pserc | 2558 | 3617 | 8.52e-01 | 2.42e+00 | 1.33e+01 | N.D. | 9.38e+02 |
case300_ieee | 2382 | 3478 | 2.19e-01 | 5.70e-01 | 3.46e+00 | N.D. | 7.65e+02 |
case500_goc | 4254 | 6097 | 4.14e-01 | 1.10e+00 | 7.85e+00 | N.D. | 3.29e+03 |
case588_sdet | 4110 | 5979 | 3.45e-01 | 8.93e-01 | 6.87e+00 | N.D. | 2.27e+03 |
case793_goc | 5432 | 7978 | 4.68e-01 | 1.25e+00 | 1.20e+01 | N.D. | 4.68e+03 |
case1354_pegase | 11192 | 16646 | 1.21e+00 | 3.94e+00 | 5.45e+01 | N.D. | N.D. |
case1803_snem | 15246 | 23172 | 2.15e+00 | 7.19e+00 | 8.72e+01 | N.D. | N.D. |
case1888_rte | 14480 | 21494 | 4.41e+00 | 1.75e+01 | 2.12e+02 | N.D. | N.D. |
case1951_rte | 15018 | 22075 | 2.43e+00 | 8.52e+00 | 1.11e+02 | N.D. | N.D. |
case2000_goc | 19008 | 29432 | 1.94e+00 | 7.22e+00 | 8.20e+01 | N.D. | N.D. |
case2312_goc | 17128 | 25716 | 1.85e+00 | 6.42e+00 | 8.53e+01 | N.D. | N.D. |
case2383wp_k | 17004 | 25039 | 2.18e+00 | 7.37e+00 | 7.24e+01 | N.D. | N.D. |
case2736sp_k | 19088 | 28356 | 1.88e+00 | 6.66e+00 | 6.95e+01 | N.D. | N.D. |
case2737sop_k | 18988 | 28358 | 1.64e+00 | 6.27e+00 | 6.33e+01 | N.D. | N.D. |
case2742_goc | 24540 | 38196 | 7.11e+00 | 2.51e+01 | 2.51e+02 | N.D. | N.D. |
case2746wp_k | 19520 | 28446 | 1.84e+00 | 6.40e+00 | 7.88e+01 | N.D. | N.D. |
case2746wop_k | 19582 | 28642 | 1.68e+00 | 5.92e+00 | 7.02e+01 | N.D. | N.D. |
case2848_rte | 21822 | 32129 | 3.72e+00 | 1.35e+01 | 1.66e+02 | N.D. | N.D. |
case2853_sdet | 23028 | 33154 | 2.62e+00 | 9.34e+00 | 2.18e+02 | N.D. | N.D. |
case2868_rte | 22090 | 32393 | 3.88e+00 | 1.44e+01 | 1.92e+02 | N.D. | N.D. |
case2869_pegase | 25086 | 37813 | 3.34e+00 | 1.15e+01 | 1.68e+02 | N.D. | N.D. |
case3012wp_k | 21082 | 31029 | 2.83e+00 | 9.54e+00 | 1.02e+02 | N.D. | N.D. |
case3022_goc | 23238 | 34990 | 2.90e+00 | 1.01e+01 | 1.87e+02 | N.D. | N.D. |
case3120sp_k | 21608 | 32092 | 2.62e+00 | 1.03e+01 | 9.73e+01 | N.D. | N.D. |
case3375wp_k | 24350 | 35876 | 3.25e+00 | 1.44e+01 | 1.49e+02 | N.D. | N.D. |
case3970_goc | 35270 | 54428 | 6.42e+00 | 1.60e+01 | 3.60e+02 | N.D. | N.D. |
case4020_goc | 36696 | 56957 | 8.95e+00 | 2.02e+01 | 2.74e+02 | N.D. | N.D. |
case4601_goc | 38814 | 59596 | 7.86e+00 | 2.26e+01 | 3.90e+02 | N.D. | N.D. |
case4619_goc | 42532 | 66289 | 7.97e+00 | 2.08e+01 | 2.93e+02 | N.D. | N.D. |
case4661_sdet | 34758 | 51302 | 5.71e+00 | 1.62e+01 | 2.40e+02 | N.D. | N.D. |
case4837_goc | 41398 | 64030 | 6.51e+00 | 2.10e+01 | 3.13e+02 | N.D. | N.D. |
case4917_goc | 37872 | 56917 | 5.58e+00 | 1.81e+01 | 4.15e+02 | N.D. | N.D. |
case5658_epigrids | 48552 | 74821 | 7.65e+00 | 2.26e+01 | 4.03e+02 | N.D. | N.D. |
case6468_rte | 49734 | 75937 | 1.64e+01 | 6.50e+01 | 7.83e+02 | N.D. | N.D. |
case6470_rte | 50482 | 75976 | 1.05e+01 | 3.42e+01 | 5.71e+02 | N.D. | N.D. |
case6495_rte | 50426 | 76124 | 1.82e+01 | 5.59e+01 | 9.21e+02 | N.D. | N.D. |
case6515_rte | 50546 | 76290 | 1.52e+01 | 4.85e+01 | 7.53e+02 | N.D. | N.D. |
case7336_epigrids | 62116 | 95306 | 9.94e+00 | 2.72e+01 | 5.24e+02 | N.D. | N.D. |
case8387_pegase | 78748 | 118702 | 1.69e+01 | 4.57e+01 | 1.72e+03 | N.D. | N.D. |
case9241_pegase | 85568 | 130826 | 1.70e+01 | 4.92e+01 | 1.96e+03 | N.D. | N.D. |
case9591_goc | 83572 | 130588 | 2.66e+01 | 5.82e+01 | 9.83e+02 | N.D. | N.D. |
case10000_goc | 76804 | 112352 | 1.63e+01 | 4.87e+01 | 9.10e+02 | N.D. | N.D. |
case10192_epigrids | 89850 | 139456 | 2.07e+01 | 5.64e+01 | 1.26e+03 | N.D. | N.D. |
case10480_goc | 96750 | 150874 | 2.91e+01 | 6.26e+01 | 1.26e+03 | N.D. | N.D. |
case13659_pegase | 117370 | 170588 | 2.18e+01 | 5.88e+01 | 2.86e+03 | N.D. | N.D. |
case19402_goc | 179562 | 281733 | 7.19e+01 | 1.39e+02 | 4.04e+03 | N.D. | N.D. |
case20758_epigrids | 179236 | 274918 | 3.78e+01 | 8.88e+01 | 4.39e+03 | N.D. | N.D. |
case24464_goc | 203374 | 313641 | 5.02e+01 | 1.21e+02 | 5.23e+03 | N.D. | N.D. |
case30000_goc | 208624 | 307752 | 9.67e+01 | 2.31e+02 | 7.01e+03 | N.D. | N.D. |
case78484_epigrids | 674562 | 1039062 | 3.54e+02 | 7.25e+02 | N.D. | N.D. | N.D. |
Julia v1.10.0
[54578032] ADNLPModels v0.8.2
[2569d6c7] ConcreteStructs v0.2.3
[1037b233] ExaModels v0.7.1
[f6369f11] ForwardDiff v0.10.36
[b6b21f68] Ipopt v1.6.3
[4076af6c] JuMP v1.22.2
[961ee093] ModelingToolkit v9.19.0
[f4238b75] NLPModelsIpopt v0.10.2
[01bcebdf] Nonconvex v2.1.3
[bf347577] NonconvexIpopt v0.4.3
[429524aa] Optim v1.9.4
[7f7a1694] Optimization v3.26.1
[fd9f6733] OptimizationMOI v0.4.2
[c36e90e8] PowerModels v0.21.1
Ipopt was configured to run with the linear solver HSL ma27.
@ccoffrin, how much more scalable is it? The improvements here are due to code written by @hill and me, which ADNLPModels.jl now relies on, and we’re currently writing a paper about our creations, so I’m very curious!
See SparseConnectivityTracer.jl and SparseMatrixColorings.jl for details.
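For context, the sparsity-pattern detection step these packages provide looks roughly like this (a minimal sketch based on the detector interface shown in the SparseConnectivityTracer README; the function `f` is just an illustrative example):

```julia
# Detect the sparsity pattern of a Jacobian by tracing the function once;
# a coloring of that pattern then lets AD recover the full Jacobian from
# only a few directional derivatives.
using SparseConnectivityTracer

f(x) = [x[1]^2 + x[2], x[3] * x[4], x[1] * x[4]]
x0 = rand(4)

pattern = jacobian_sparsity(f, x0, TracerSparsityDetector())  # sparse Bool matrix
```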
Last run was AC Optimal Power Flow in Various Nonlinear Optimization Frameworks - #81 by ccoffrin
So roughly an order of magnitude?
That’s amazing!
With the upcoming improvements to SparseConnectivityTracer.jl, I wouldn’t be surprised if we could gain some more.
More updates from @amontoison to ADNLPModels reduced the solve time of this framework by roughly 2x.
I also updated to the latest compatible versions of all packages, but I don’t see any other notable changes.
Case | Vars | Cons | ExaModels (s) | JuMP (s) | ADNLPModels (s) | NonConvex (s) | Optimization (s) |
---|---|---|---|---|---|---|---|
case3_lmbd | 24 | 28 | 9.72e-03 | 1.28e-02 | 1.38e-02 | 7.84e+00 | 2.93e+00 |
case5_pjm | 44 | 53 | 5.26e-02 | 8.12e-02 | 6.12e-02 | 1.73e+01 | 4.01e+00 |
case14_ieee | 118 | 169 | 4.05e-02 | 7.94e-02 | 6.62e-02 | 7.89e+01 | 8.18e+00 |
case24_ieee_rts | 266 | 315 | 6.51e-02 | 1.35e-01 | 1.30e-01 | 2.35e+02 | 1.80e+01 |
case30_ieee | 236 | 348 | 4.76e-02 | 8.60e-02 | 1.26e-01 | 2.45e+02 | 1.80e+01 |
case30_as | 236 | 348 | 5.87e-02 | 1.01e-01 | 1.06e-01 | 2.34e+02 | 1.81e+01 |
case39_epri | 282 | 401 | 4.20e-02 | 1.24e-01 | 2.49e-01 | 3.12e+02 | 2.24e+01 |
case57_ieee | 448 | 675 | 6.07e-02 | 1.16e-01 | 1.80e-01 | 5.41e+02 | 4.51e+01 |
case60_c | 518 | 737 | 5.33e-02 | 1.42e-01 | 2.53e-01 | 4.99e+02 | 4.37e+01 |
case73_ieee_rts | 824 | 987 | 9.75e-02 | 2.09e-01 | 3.67e-01 | 1.25e+03 | 8.89e+01 |
case89_pegase | 1042 | 1649 | 1.19e-01 | 3.28e-01 | 1.20e+00 | 3.78e+03 | 2.15e+02 |
case118_ieee | 1088 | 1539 | 1.11e-01 | 2.92e-01 | 6.63e-01 | 3.03e+03 | 1.68e+02 |
case162_ieee_dtc | 1484 | 2313 | 1.53e-01 | 3.43e-01 | 1.13e+00 | N.D. | 3.77e+02 |
case179_goc | 1468 | 2200 | 1.76e-01 | 4.43e-01 | 1.36e+00 | 5.54e+03 | 2.89e+02 |
case197_snem | 1608 | 2397 | 1.27e-01 | 3.35e-01 | 7.68e-01 | N.D. | 4.07e+02 |
case200_activ | 1456 | 2116 | 9.57e-02 | 2.71e-01 | 6.68e-01 | 6.27e+03 | 3.01e+02 |
case240_pserc | 2558 | 3617 | 8.58e-01 | 2.43e+00 | 7.73e+00 | N.D. | 9.03e+02 |
case300_ieee | 2382 | 3478 | 2.13e-01 | 6.42e-01 | 1.81e+00 | N.D. | 7.61e+02 |
case500_goc | 4254 | 6097 | 4.09e-01 | 1.37e+00 | 4.38e+00 | N.D. | 3.14e+03 |
case588_sdet | 4110 | 5979 | 3.70e-01 | 1.11e+00 | 3.07e+00 | N.D. | 2.72e+03 |
case793_goc | 5432 | 7978 | 4.54e-01 | 1.50e+00 | 5.46e+00 | N.D. | 5.03e+03 |
case1354_pegase | 11192 | 16646 | 1.19e+00 | 4.05e+00 | 2.10e+01 | N.D. | N.D. |
case1803_snem | 15246 | 23172 | 2.06e+00 | 7.67e+00 | 4.07e+01 | N.D. | N.D. |
case1888_rte | 14480 | 21494 | 4.41e+00 | 1.72e+01 | 1.05e+02 | N.D. | N.D. |
case1951_rte | 15018 | 22075 | 2.54e+00 | 8.70e+00 | 5.34e+01 | N.D. | N.D. |
case2000_goc | 19008 | 29432 | 1.89e+00 | 6.81e+00 | 3.99e+01 | N.D. | N.D. |
case2312_goc | 17128 | 25716 | 1.81e+00 | 6.27e+00 | 3.57e+01 | N.D. | N.D. |
case2383wp_k | 17004 | 25039 | 2.18e+00 | 6.94e+00 | 3.80e+01 | N.D. | N.D. |
case2736sp_k | 19088 | 28356 | 1.87e+00 | 5.78e+00 | 3.37e+01 | N.D. | N.D. |
case2737sop_k | 18988 | 28358 | 1.62e+00 | 5.29e+00 | 3.20e+01 | N.D. | N.D. |
case2742_goc | 24540 | 38196 | 7.15e+00 | 2.40e+01 | 1.24e+02 | N.D. | N.D. |
case2746wp_k | 19520 | 28446 | 1.85e+00 | 6.11e+00 | 3.57e+01 | N.D. | N.D. |
case2746wop_k | 19582 | 28642 | 1.72e+00 | 5.50e+00 | 3.37e+01 | N.D. | N.D. |
case2848_rte | 21822 | 32129 | 3.64e+00 | 1.29e+01 | 9.37e+01 | N.D. | N.D. |
case2853_sdet | 23028 | 33154 | 2.66e+00 | 9.01e+00 | 9.67e+01 | N.D. | N.D. |
case2868_rte | 22090 | 32393 | 4.01e+00 | 1.44e+01 | 8.67e+01 | N.D. | N.D. |
case2869_pegase | 25086 | 37813 | 3.31e+00 | 1.15e+01 | 1.05e+02 | N.D. | N.D. |
case3012wp_k | 21082 | 31029 | 2.76e+00 | 9.08e+00 | 4.77e+01 | N.D. | N.D. |
case3022_goc | 23238 | 34990 | 2.95e+00 | 1.04e+01 | 7.13e+01 | N.D. | N.D. |
case3120sp_k | 21608 | 32092 | 2.66e+00 | 8.89e+00 | 4.82e+01 | N.D. | N.D. |
case3375wp_k | 24350 | 35876 | 3.20e+00 | 1.09e+01 | 7.37e+01 | N.D. | N.D. |
case3970_goc | 35270 | 54428 | 6.44e+00 | 1.63e+01 | 1.38e+02 | N.D. | N.D. |
case4020_goc | 36696 | 56957 | 8.93e+00 | 2.20e+01 | 1.40e+02 | N.D. | N.D. |
case4601_goc | 38814 | 59596 | 7.92e+00 | 2.26e+01 | 1.72e+02 | N.D. | N.D. |
case4619_goc | 42532 | 66289 | 7.88e+00 | 1.95e+01 | 1.31e+02 | N.D. | N.D. |
case4661_sdet | 34758 | 51302 | 5.99e+00 | 1.62e+01 | 1.11e+02 | N.D. | N.D. |
case4837_goc | 41398 | 64030 | 6.70e+00 | 2.13e+01 | 1.41e+02 | N.D. | N.D. |
case4917_goc | 37872 | 56917 | 5.71e+00 | 1.96e+01 | 1.34e+02 | N.D. | N.D. |
case5658_epigrids | 48552 | 74821 | 7.64e+00 | 2.26e+01 | 1.96e+02 | N.D. | N.D. |
case6468_rte | 49734 | 75937 | 1.60e+01 | 5.73e+01 | 3.77e+02 | N.D. | N.D. |
case6470_rte | 50482 | 75976 | 1.05e+01 | 3.31e+01 | 2.58e+02 | N.D. | N.D. |
case6495_rte | 50426 | 76124 | 1.89e+01 | 6.02e+01 | 4.25e+02 | N.D. | N.D. |
case6515_rte | 50546 | 76290 | 1.51e+01 | 4.76e+01 | 3.43e+02 | N.D. | N.D. |
case7336_epigrids | 62116 | 95306 | 9.71e+00 | 2.62e+01 | 2.57e+02 | N.D. | N.D. |
case8387_pegase | 78748 | 118702 | 1.60e+01 | 4.54e+01 | 1.03e+03 | N.D. | N.D. |
case9241_pegase | 85568 | 130826 | 1.77e+01 | 4.90e+01 | 1.09e+03 | N.D. | N.D. |
case9591_goc | 83572 | 130588 | 2.67e+01 | 5.87e+01 | 4.67e+02 | N.D. | N.D. |
case10000_goc | 76804 | 112352 | 1.67e+01 | 4.51e+01 | 4.49e+02 | N.D. | N.D. |
case10192_epigrids | 89850 | 139456 | 2.10e+01 | 5.39e+01 | 5.96e+02 | N.D. | N.D. |
case10480_goc | 96750 | 150874 | 2.77e+01 | 6.91e+01 | 5.91e+02 | N.D. | N.D. |
case13659_pegase | 117370 | 170588 | 2.17e+01 | 5.91e+01 | 1.43e+03 | N.D. | N.D. |
case19402_goc | 179562 | 281733 | 7.15e+01 | 1.40e+02 | 1.73e+03 | N.D. | N.D. |
case20758_epigrids | 179236 | 274918 | 3.55e+01 | 8.97e+01 | 1.61e+03 | N.D. | N.D. |
case24464_goc | 203374 | 313641 | 5.02e+01 | 1.18e+02 | 2.50e+03 | N.D. | N.D. |
case30000_goc | 208624 | 307752 | 9.68e+01 | 2.26e+02 | 3.72e+03 | N.D. | N.D. |
case78484_epigrids | 674562 | 1039062 | 3.60e+02 | 7.64e+02 | N.D. | N.D. | N.D. |
Julia v1.10.0
[54578032] ADNLPModels v0.8.7
[2569d6c7] ConcreteStructs v0.2.3
[1037b233] ExaModels v0.7.1
[f6369f11] ForwardDiff v0.10.36
[b6b21f68] Ipopt v1.6.6
[4076af6c] JuMP v1.23.1
[961ee093] ModelingToolkit v9.32.0
[f4238b75] NLPModelsIpopt v0.10.2
[01bcebdf] Nonconvex v2.1.3
[bf347577] NonconvexIpopt v0.4.3
[429524aa] Optim v1.9.4
[7f7a1694] Optimization v3.28.0
[fd9f6733] OptimizationMOI v0.4.3
[c36e90e8] PowerModels v0.21.2
Ipopt was configured to run with the linear solver HSL ma27.
Thanks for the last update @ccoffrin!
I’m a little bit disappointed that we couldn’t solve the last problem in less than 2 hours.
That will be the goal for next time.
I’m now convinced that the current bottleneck is ReverseDiff.jl; it doesn’t seem to scale well at all.
I want to replace it with Enzyme.jl for the next iteration.
It seems quite stable and should theoretically offer another order of magnitude in terms of performance.
Let’s see the results in future benchmarks.
@ccoffrin, could you share where the instances come from? @hill and I are trying to run some benchmarks of our own on rosetta-opf, but the data folder only has 5 small instances. The repo’s README mentions MATPOWER as a source of instances, but I fail to see where these instances can be obtained. Any help will be appreciated.
In addition to odow’s answer: instances from PGLib can be read into PowerModels dictionaries with the PGLib package.
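For example, a minimal sketch of the PowerModels side (the file path assumes a local copy of the pglib-opf case files and is purely illustrative):

```julia
# Parse a PGLib-OPF case (MATPOWER .m format) into the PowerModels data
# dictionary consumed by the rosetta-opf scripts.
using PowerModels

data = PowerModels.parse_file("pglib-opf/pglib_opf_case14_ieee.m")
println(length(data["bus"]), " buses, ", length(data["branch"]), " branches")
```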
How would you describe the importance of ACOPF to a machine learning audience? Besides the fact that it is nonlinear and sparse, which are the two ingredients we need for our paper ^^