I’m wondering if anyone has seen this before. I could not find a discussion about it. It is rather curious and I would like to understand it better.
I benchmarked a reactor model with BenchmarkTools; the average time to solve is about 8.5 s. After I rebuilt the system image, the average time to solve increased to about 10.5 s. Deleting and reinstalling Julia brought the time back down.
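For context, the timing setup looks roughly like this. This is a minimal sketch, not the actual model: `solve_reactor` here is a stand-in workload, since the real solve is not shown in the post.

```julia
using BenchmarkTools, Statistics

# Stand-in for the reactor solve (the real model is far more expensive).
solve_reactor(n) = sum(sqrt(i) for i in 1:n)

# Collect samples and report the mean time, as quoted in the post.
b = @benchmark solve_reactor(10^6)
mean_seconds = mean(b.times) / 1e9   # b.times is in nanoseconds
```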
This was repeatable on one and four cores (the absolute times differ, but the relative increase is similar) and on v0.6.2 and v0.6.3, on an Intel CPU Windows 10 laptop. I started with the official binaries.
This model starts from a saved Jacobian, does a single iteration, and checks the error against a tolerance. If the tolerance is exceeded, the Jacobian is recalculated. Before the system image rebuild, the test case did not trigger a recalculation. Afterwards it did, until I updated the saved Jacobian. After the reinstall, that updated Jacobian started triggering recalculations, and reverting to the original Jacobian resolved it. The Jacobian recalculations are quite expensive and are not included in the times above.
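The reuse logic described above can be sketched as follows. This is a toy illustration under my own assumptions, not the actual model's code: the scalar `residual`/`jacobian` pair stands in for the real reactor equations, and the function names are hypothetical.

```julia
using LinearAlgebra

# Toy problem: solve x^2 = 2 with a Newton-type step.
residual(x) = [x[1]^2 - 2.0]
jacobian(x) = reshape([2x[1]], 1, 1)

# One iteration with the saved Jacobian; rebuild it only if the
# residual still exceeds the tolerance (the expensive path).
function step_with_saved_jacobian(x, J_saved; tol = 1e-6)
    x = x - J_saved \ residual(x)
    if norm(residual(x)) > tol
        J = jacobian(x)              # expensive recalculation
        x = x - J \ residual(x)
        return x, J, true            # true: recalculation was triggered
    end
    return x, J_saved, false
end
```

The point of the scheme is that a tiny change in the computed residual near the tolerance boundary flips the branch, which would explain why a rebuild that perturbs floating-point results at the last bit can suddenly trigger recalculations.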
I’ll pull apart the results and the two Jacobians to find the differences, but I was not expecting different numerical results from a system image rebuild. Then again, I was also not expecting a slowdown, so my expectations are suspect.
Any wisdom on where to start looking?