AC Optimal Power Flow in Various Nonlinear Optimization Frameworks

Two small data files are included in rosetta-opf/data; the rest are available in pglib-opf.
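For readers who have not opened the repo: each rosetta-opf script builds the same AC-OPF model in one framework and hands it to Ipopt. A minimal sketch of the JuMP + Ipopt modeling style being benchmarked follows (a toy nonconvex problem with illustrative variable names, not the actual AC-OPF formulation):

```julia
using JuMP, Ipopt

# Toy nonconvex model in the JuMP + Ipopt style used by rosetta-opf.
# The real AC-OPF adds power-balance and branch-flow constraints over
# network data parsed from a pglib-opf case file; this is illustrative only.
model = Model(Ipopt.Optimizer)
set_silent(model)

@variable(model, 0.9 <= vm <= 1.1, start = 1.0)    # voltage-magnitude-like
@variable(model, -pi/3 <= va <= pi/3, start = 0.0) # voltage-angle-like

# Trigonometric constraints like this are what make AC-OPF nonconvex.
@NLconstraint(model, vm^2 * cos(va) >= 0.95)
@objective(model, Min, (vm - 1.0)^2 + va^2)

optimize!(model)
# The constraint is inactive at the unconstrained optimum, so vm ≈ 1, va ≈ 0.
println(value(vm), " ", value(va))
```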

Table updated with results from [01bcebdf] Nonconvex v2.0.3 and [bf347577] NonconvexIpopt v0.4.1, which resolved convergence issues on a number of cases.

Feel free to update the results above for Nonconvex using NonconvexIpopt 0.4.2. There are some nice perf improvements there.

Thanks for the heads up! I think I’ll wait a while longer to re-run in case there are updates from other packages as well, including Julia v1.8, which seems pretty close.

Given it has been about a year since these benchmarks were last conducted and the ecosystem is continually improving, I thought it would be good to revisit this with Julia v1.9 and the latest versions of all packages (details at the bottom).

Some preliminary observations:

  • Runtimes are generally stable (maybe slightly improved on average)
  • NonConvex now solves two cases that it did not previously: case89_pegase and case118_ieee

Here are the latest runtime results (in seconds):

Case Vars Cons JuMP NLPModels NonConvex Optim Optimization
case3_lmbd 24 28 1.84e-02 1.63e+01 5.28e+00 INF. 2.72e+00
case5_pjm 44 53 2.96e-02 2.24e-01 2.67e+00 INF. 3.76e-01
case14_ieee 118 169 2.54e-01 1.97e+01 5.07e+01 N.D. 1.01e+01
case24_ieee_rts 266 315 2.75e-01 4.46e+01 2.78e+02 INF. 2.50e+01
case30_ieee 236 348 3.21e-01 2.92e+01 2.69e+02 N.D. 2.35e+01
case30_as 236 348 2.67e-01 2.43e+01 2.67e+02 INF. 2.39e+01
case39_epri 282 401 1.84e-01 6.12e+01 1.66e+02 N.D. 3.00e+01
case57_ieee 448 675 3.81e-01 8.49e+01 4.01e+02 N.D. 7.43e+01
case60_c 518 737 4.22e-01 1.93e+02 4.97e+02 N.D. 8.58e+01
case73_ieee_rts 824 987 4.75e-01 5.71e+02 1.60e+03 N.D. 1.98e+02
case89_pegase 1042 1649 8.57e-01 1.91e+03 6.42e+03 N.D. N.D.
case118_ieee 1088 1539 7.17e-01 1.78e+03 4.57e+03 N.D. N.D.
case162_ieee_dtc 1484 2313 9.27e-01 5.02e+03 N.D. N.D. N.D.
case179_goc 1468 2200 1.22e+00 N.D. N.D. N.D. N.D.
case200_activ 1456 2116 7.18e-01 2.99e+03 N.D. N.D. N.D.
case240_pserc 2558 3617 7.72e+00 N.D. N.D. N.D. N.D.
case300_ieee 2382 3478 1.45e+00 N.D. N.D. N.D. N.D.
case500_goc 4254 6097 2.79e+00 N.D. N.D. N.D. N.D.
case588_sdet 4110 5979 2.35e+00 N.D. N.D. N.D. N.D.
case793_goc 5432 7978 3.50e+00 N.D. N.D. N.D. N.D.
case1354_pegase 11192 16646 1.77e+01 N.D. N.D. N.D. N.D.
case1888_rte 14480 21494 5.75e+01 N.D. N.D. N.D. N.D.
case1951_rte 15018 22075 2.67e+01 N.D. N.D. N.D. N.D.
case2000_goc 19008 29432 1.88e+01 N.D. N.D. N.D. N.D.
case2312_goc 17128 25716 2.18e+01 N.D. N.D. N.D. N.D.
case2383wp_k 17004 25039 2.16e+01 N.D. N.D. N.D. N.D.
case2736sp_k 19088 28356 1.72e+01 N.D. N.D. N.D. N.D.
case2737sop_k 18988 28358 1.51e+01 N.D. N.D. N.D. N.D.
case2742_goc 24540 38196 8.26e+01 N.D. N.D. N.D. N.D.
case2746wp_k 19520 28446 1.89e+01 N.D. N.D. N.D. N.D.
case2746wop_k 19582 28642 1.54e+01 N.D. N.D. N.D. N.D.
case2848_rte 21822 32129 3.85e+01 N.D. N.D. N.D. N.D.
case2853_sdet 23028 33154 2.87e+01 N.D. N.D. N.D. N.D.
case2868_rte 22090 32393 4.53e+01 N.D. N.D. N.D. N.D.
case2869_pegase 25086 37813 3.84e+01 N.D. N.D. N.D. N.D.
case3012wp_k 21082 31029 2.65e+01 N.D. N.D. N.D. N.D.
case3022_goc 23238 34990 2.85e+01 N.D. N.D. N.D. N.D.
case3120sp_k 21608 32092 2.73e+01 N.D. N.D. N.D. N.D.
case3375wp_k 24350 35876 3.28e+01 N.D. N.D. N.D. N.D.
case3970_goc 35270 54428 4.52e+01 N.D. N.D. N.D. N.D.
case4020_goc 36696 56957 1.86e+02 N.D. N.D. N.D. N.D.
case4601_goc 38814 59596 7.01e+01 N.D. N.D. N.D. N.D.
case4619_goc 42532 66289 6.34e+01 N.D. N.D. N.D. N.D.
case4661_sdet 34758 51302 4.33e+01 N.D. N.D. N.D. N.D.
case4837_goc 41398 64030 6.76e+01 N.D. N.D. N.D. N.D.
case4917_goc 37872 56917 7.32e+01 N.D. N.D. N.D. N.D.
case6468_rte 49734 75937 1.58e+02 N.D. N.D. N.D. N.D.
case6470_rte 50482 75976 1.10e+02 N.D. N.D. N.D. N.D.
case6495_rte 50426 76124 1.81e+02 N.D. N.D. N.D. N.D.
case6515_rte 50546 76290 1.52e+02 N.D. N.D. N.D. N.D.
case8387_pegase 78748 118702 1.38e+02 N.D. N.D. N.D. N.D.
case9241_pegase 85568 130826 1.54e+02 N.D. N.D. N.D. N.D.
case9591_goc 83572 130588 2.68e+02 N.D. N.D. N.D. N.D.
case10000_goc 76804 112352 1.58e+02 N.D. N.D. N.D. N.D.
case10480_goc 96750 150874 3.10e+02 N.D. N.D. N.D. N.D.
case13659_pegase 117370 170588 2.04e+02 N.D. N.D. N.D. N.D.
case19402_goc 179562 281733 4.84e+02 N.D. N.D. N.D. N.D.
case24464_goc 203374 313641 3.65e+03 N.D. N.D. N.D. N.D.
case30000_goc 208624 307752 1.81e+03 N.D. N.D. N.D. N.D.

Package Details

[54578032] ADNLPModels v0.6.0
[b6b21f68] Ipopt v1.2.1
[4076af6c] JuMP v1.11.1
[961ee093] ModelingToolkit v8.46.1
[f4238b75] NLPModelsIpopt v0.10.1
[01bcebdf] Nonconvex v2.1.2
[bf347577] NonconvexIpopt v0.4.2
[429524aa] Optim v1.7.5
[7f7a1694] Optimization v3.14.2
[fd9f6733] OptimizationMOI v0.1.12
[c36e90e8] PowerModels v0.19.9

Ipopt was run with the default linear solver, MUMPS 5.4.1.
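For reference, the linear solver is a standard Ipopt option and can be set through each framework's options mechanism; via JuMP it might look like the sketch below (switching to HSL ma27, as done in a later run, assumes an HSL library is installed and loadable by Ipopt — the default Ipopt build ships MUMPS only):

```julia
using JuMP, Ipopt

model = Model(Ipopt.Optimizer)
# "linear_solver" is a standard Ipopt option; "ma27" requires an HSL
# library to be installed and loadable. The default build ships MUMPS.
set_attribute(model, "linear_solver", "ma27")
```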

Note that this is not with the MTK structural simplification, which should be merging tomorrow. It would be interesting to try it with that and see how much of an effect it makes. The SymbolicIR + FastSymbolicDifferentiation.jl (github.com/brianguenter/FastSymbolicDifferentiation.jl) version coming at JuliaCon will fix the scaling.

I guess Carleton’s point is that this is what a user would experience if they installed each package today.

Carleton has been running these benchmarks over the year, and we’ll continue to run them, so we should be able to see the improvements when/if they land.

I want it as a development thing. I care about tracking this to see what the effect of the pieces are. I don’t care about “winning benchmarks” or something silly. I want to know whether SymbolicIR, Jacobian construction, or simplification are the piece that makes the biggest impact. If the benchmark is just re-run a year from now you’ll definitely get a better number but it won’t be informative as to next development steps, so I want @ccoffrin to know about these steps and track them here so there’s a clear history. It’s fairly rare that you get an opportunity to history track like this since benchmarks take a long time to write.

I’ll do my best to re-run as major developments happen. The next update probably needs to be close to the end of the summer due to my other commitments, but maybe that is suitable if we are expecting some big improvements from JuliaCon related releases. Feel free to ping me if there is a notable development and re-run would be valuable, I’ll give it my best effort.

Spot checking any case at the 10, 100, 1000 bus scales is also a good way to check for progress without needing a full regression test.

Can we put this onto the SciMLBenchmarks system so we can auto-trigger it?

Absolutely. The code is here: github.com/lanl-ansi/rosetta-opf (AC-OPF implementations in various NLP modeling frameworks), and the input data is here: github.com/power-grid-lib/pglib-opf (benchmarks for the optimal power flow problem).

Thanks for the update. I noticed that you say ADNLPModels 0.6.0 but the Manifest.toml has 0.3.3. Maybe I haven’t found the latest repo/branch?

It seems that Nonconvex is adding a ton of overhead. Thanks for maintaining these benchmarks. I will try to look into what’s happening there some time this year 🙂

@Vaibhavdixit02

Sorry about that! I confirmed that the version that was used in this result table is ADNLPModels 0.6.0 but the manifest in the rosetta-opf repo was out of date. I have fixed the rosetta-opf repo to reflect the latest versions used in this table.

Thanks! Just checking, no worries. We hope to get some improvements in soon as well, and we’ll help update if needed.

Latest results incorporating recent updates from ADNLPModels v0.7, which enables sparse Jacobians and Hessians.

Case Vars Cons JuMP NLPModels NonConvex Optim Optimization
case3_lmbd 24 28 1.09e-02 3.87e-02 4.57e+00 INF. 2.91e+00
case5_pjm 44 53 1.91e-02 9.77e-02 1.85e+00 INF. 3.29e-01
case14_ieee 118 169 2.29e-01 3.65e-01 4.86e+01 N.D. 9.16e+00
case24_ieee_rts 266 315 2.29e-01 9.26e-01 2.75e+02 INF. 2.40e+01
case30_ieee 236 348 2.55e-01 8.76e-01 2.58e+02 N.D. 2.42e+01
case30_as 236 348 2.26e-01 6.96e-01 2.66e+02 INF. 2.33e+01
case39_epri 282 401 9.87e-02 6.58e-01 1.50e+02 N.D. 2.86e+01
case57_ieee 448 675 2.75e-01 1.16e+00 3.81e+02 N.D. 6.87e+01
case60_c 518 737 2.88e-01 1.58e+00 4.64e+02 N.D. 8.81e+01
case73_ieee_rts 824 987 3.08e-01 2.26e+00 1.49e+03 N.D. 2.03e+02
case89_pegase 1042 1649 4.98e-01 5.90e+00 5.89e+03 N.D. N.D.
case118_ieee 1088 1539 4.36e-01 4.33e+00 4.52e+03 N.D. N.D.
case162_ieee_dtc 1484 2313 5.17e-01 6.19e+00 N.D. N.D. N.D.
case179_goc 1468 2200 5.79e-01 9.42e+00 N.D. N.D. N.D.
case197_snem 1608 2397 5.10e-01 6.41e+00 N.D. N.D. N.D.
case200_activ 1456 2116 4.44e-01 4.92e+00 N.D. N.D. N.D.
case240_pserc 2558 3617 2.78e+00 4.32e+01 N.D. N.D. N.D.
case300_ieee 2382 3478 6.85e-01 1.08e+01 N.D. N.D. N.D.
case500_goc 4254 6097 1.26e+00 2.72e+01 N.D. N.D. N.D.
case588_sdet 4110 5979 1.08e+00 2.16e+01 N.D. N.D. N.D.
case793_goc 5432 7978 1.52e+00 3.48e+01 N.D. N.D. N.D.
case1354_pegase 11192 16646 5.50e+00 1.57e+02 N.D. N.D. N.D.
case1803_snem 15246 23172 1.17e+01 3.13e+02 N.D. N.D. N.D.
case1888_rte 14480 21494 2.49e+01 4.97e+02 N.D. N.D. N.D.
case1951_rte 15018 22075 1.15e+01 3.17e+02 N.D. N.D. N.D.
case2000_goc 19008 29432 8.42e+00 3.89e+02 N.D. N.D. N.D.
case2312_goc 17128 25716 1.04e+01 2.98e+02 N.D. N.D. N.D.
case2383wp_k 17004 25039 8.82e+00 2.82e+02 N.D. N.D. N.D.
case2736sp_k 19088 28356 8.48e+00 2.97e+02 N.D. N.D. N.D.
case2737sop_k 18988 28358 9.38e+00 2.89e+02 N.D. N.D. N.D.
case2742_goc 24540 38196 3.29e+01 1.05e+03 N.D. N.D. N.D.
case2746wp_k 19520 28446 8.77e+00 2.88e+02 N.D. N.D. N.D.
case2746wop_k 19582 28642 9.10e+00 2.89e+02 N.D. N.D. N.D.
case2848_rte 21822 32129 1.73e+01 6.25e+02 N.D. N.D. N.D.
case2853_sdet 23028 33154 1.17e+01 7.63e+02 N.D. N.D. N.D.
case2868_rte 22090 32393 2.06e+01 6.97e+02 N.D. N.D. N.D.
case2869_pegase 25086 37813 1.69e+01 7.92e+02 N.D. N.D. N.D.
case3012wp_k 21082 31029 1.08e+01 4.04e+02 N.D. N.D. N.D.
case3022_goc 23238 34990 1.23e+01 7.25e+02 N.D. N.D. N.D.
case3120sp_k 21608 32092 1.21e+01 4.23e+02 N.D. N.D. N.D.
case3375wp_k 24350 35876 1.40e+01 6.09e+02 N.D. N.D. N.D.
case3970_goc 35270 54428 2.49e+01 1.88e+03 N.D. N.D. N.D.
case4020_goc 36696 56957 2.41e+01 1.98e+03 N.D. N.D. N.D.
case4601_goc 38814 59596 2.57e+01 2.28e+03 N.D. N.D. N.D.
case4619_goc 42532 66289 2.56e+01 2.32e+03 N.D. N.D. N.D.
case4661_sdet 34758 51302 2.61e+01 1.39e+03 N.D. N.D. N.D.
case4837_goc 41398 64030 2.69e+01 2.19e+03 N.D. N.D. N.D.
case4917_goc 37872 56917 2.35e+01 2.03e+03 N.D. N.D. N.D.
case5658_epigrids 48552 74821 2.64e+01 3.23e+03 N.D. N.D. N.D.
case6468_rte 49734 75937 9.30e+01 3.98e+03 N.D. N.D. N.D.
case6470_rte 50482 75976 4.08e+01 3.52e+03 N.D. N.D. N.D.
case6495_rte 50426 76124 7.56e+01 4.58e+03 N.D. N.D. N.D.
case6515_rte 50546 76290 5.83e+01 3.86e+03 N.D. N.D. N.D.
case7336_epigrids 62116 95306 3.31e+01 5.16e+03 N.D. N.D. N.D.
case8387_pegase 78748 118702 6.26e+01 N.D. N.D. N.D. N.D.
case9241_pegase 85568 130826 6.49e+01 N.D. N.D. N.D. N.D.
case9591_goc 83572 130588 8.18e+01 N.D. N.D. N.D. N.D.
case10000_goc 76804 112352 6.61e+01 N.D. N.D. N.D. N.D.
case10192_epigrids 89850 139456 7.39e+01 N.D. N.D. N.D. N.D.
case10480_goc 96750 150874 7.66e+01 N.D. N.D. N.D. N.D.
case13659_pegase 117370 170588 7.37e+01 N.D. N.D. N.D. N.D.
case19402_goc 179562 281733 1.55e+02 N.D. N.D. N.D. N.D.
case20758_epigrids 179236 274918 1.02e+02 N.D. N.D. N.D. N.D.
case24464_goc 203374 313641 1.50e+02 N.D. N.D. N.D. N.D.
case30000_goc 208624 307752 3.27e+02 N.D. N.D. N.D. N.D.
case78484_epigrids 674562 1039062 8.00e+02 N.D. N.D. N.D. N.D.
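For readers unfamiliar with this path: ADNLPModels builds an NLPModels-compatible problem from plain Julia functions via automatic differentiation, and with v0.7 the Jacobians and Hessians it hands to Ipopt are sparse rather than dense. A minimal sketch on a toy problem (not the AC-OPF model):

```julia
using ADNLPModels, NLPModelsIpopt

# Toy constrained problem: min (x1-1)^2 + (x2-2)^2  s.t.  x1^2 + x2^2 <= 4.
f(x) = (x[1] - 1.0)^2 + (x[2] - 2.0)^2
c(x) = [x[1]^2 + x[2]^2]
x0 = [0.5, 0.5]

# ADNLPModels derives the Jacobian and Hessian by AD (sparse as of v0.7).
nlp = ADNLPModel(f, x0, c, [-Inf], [4.0])

stats = ipopt(nlp, print_level = 0)
println(stats.status, " objective = ", stats.objective)
```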

Package Details

[54578032] ADNLPModels v0.7.0
[f6369f11] ForwardDiff v0.10.35
[b6b21f68] Ipopt v1.4.1
[4076af6c] JuMP v1.12.0
[961ee093] ModelingToolkit v8.63.0
[f4238b75] NLPModelsIpopt v0.10.1
[01bcebdf] Nonconvex v2.1.2
[bf347577] NonconvexIpopt v0.4.2
[429524aa] Optim v1.7.6
[7f7a1694] Optimization v3.15.2
[fd9f6733] OptimizationMOI v0.1.14
[c36e90e8] PowerModels v0.19.9
[0c5d862f] Symbolics v5.5.0

Ipopt was run with the linear solver HSL ma27.

I’m surprised/not surprised by what a difference the sparsity made. The 200 bus case went from 3000 seconds to 5 😄.

The 78k bus system is also quite a large step-up in size from the 30k.

Indeed! I should have also mentioned that I updated the table to the latest version of the PGLib-OPF benchmark library, v23.07. This library is actively growing to scales above 30k because industry would like to solve AC-OPF problems with as many as 150k buses… 😭

It looks like we have nearly linear scaling though? 150k buses in ~30 minutes? I guess they want <5 min solves.
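As a back-of-the-envelope check on that extrapolation (using the JuMP time for the largest case in the table above; linear scaling in bus count is of course just an assumption):

```julia
# Linear extrapolation of the JuMP runtime to a 150k-bus system,
# anchored on case78484_epigrids (8.00e+02 seconds) from the table.
buses_ref, time_ref = 78_484, 800.0
buses_target = 150_000
t = time_ref * buses_target / buses_ref
println(round(t / 60, digits = 1), " minutes")  # ≈ 25.5 minutes
```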