Huge performance hit in 0.6.0 for sparse matrix concatenation


I finally ported my code from 0.5.2 to 0.6.0 and discovered that a test run that took 8h35min in 0.5.2 now takes 15hr in 0.6.0. I have traced the performance problem to the following statement, which takes over 7 times longer in 0.6.0 than in 0.5.2:

    bobhess = [hessxx   hessxs   hessxt
               hessxs'  hessss   hessst
               hessxt'  hessst'  hesstt]

in which all the entries hessxx, …, hesstt are sparse matrices or vectors. Is this a known issue? (If not, I’ll open one.) Is there a workaround?


A minimal working example would be helpful.


Based on the responses to my post, I conclude that this issue was not previously known, so I have opened one with a test example:
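A minimal reproduction along these lines can be put together with `sprand`; the sizes, density, and variable names below are made up for illustration, and on Julia ≥ 0.7 the sparse types live in the `SparseArrays` stdlib (in 0.6 they were in Base):

```julia
using SparseArrays  # not needed on Julia 0.6, where sparse types are in Base

# Hypothetical stand-ins for the Hessian blocks in the original post.
A = sprand(1000, 1000, 0.01)   # SparseMatrixCSC
v = sprand(1000, 0.01)         # SparseVector

# Mixing SparseMatrixCSC and SparseVector in one concatenation is
# what hits the slow path in 0.6.0.
@time B = [A v]
```

Timing this under 0.5.2 and 0.6.0 shows the regression.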


We really need PkgBenchmark.jl or something similar working smoothly, to make sure that all our efforts to optimize code as package writers aren't thrown away in future releases of the language. I've experienced slowdowns in my packages as well from Julia v0.5 to v0.6, but couldn't nail down the sources given my time constraints.


PkgBenchmark just adds a few convenience functions on top of BenchmarkTools. Defining a benchmark suite, running it on two Julia releases, and comparing the results is straightforward using BenchmarkTools alone.
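A minimal sketch of that workflow, using the BenchmarkTools API (`BenchmarkGroup`, `@benchmarkable`, `tune!`, `run`); the benchmark itself is a made-up sparse concatenation, and comparing releases just means running the same script under each Julia binary:

```julia
using BenchmarkTools
using SparseArrays  # sparse types; part of Base on Julia 0.6

# Define a small suite of named benchmarks.
suite = BenchmarkGroup()
A = sprand(1000, 1000, 0.01)
v = sprand(1000, 0.01)
suite["concat"] = @benchmarkable [$A $v]

tune!(suite)              # pick evaluation counts
results = run(suite)      # run under each Julia release and compare
println(minimum(results["concat"]))
```

Saving the `results` from each release (e.g. with `BenchmarkTools.save`) lets you diff them with `judge`.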

As for PkgBenchmark, a lot has happened on master. Feel free to try it out.


Good news: the cause of my performance hit was identified (see the Julia issue linked above), and a workaround was proposed by Fredrik Ekre, namely, avoid concatenating SparseMatrixCSC and SparseVector objects together (convert the SparseVector to a SparseMatrixCSC first). My code ran in 8 h 35 min on 0.5.2 and 15 h on 0.6.0 before the workaround, and now 5 h 48 min on 0.6.0 after it. So overall, a noticeable performance boost from upgrading to 0.6.0.
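A sketch of the workaround, with made-up sizes; `hcat(v)` is one way to turn a length-n `SparseVector` into an n×1 `SparseMatrixCSC` so the concatenation only involves sparse matrices:

```julia
using SparseArrays  # on Julia 0.6 these types were in Base

A = sprand(1000, 1000, 0.01)   # SparseMatrixCSC
v = sprand(1000, 0.01)         # SparseVector

# Slow in 0.6.0: mixing SparseMatrixCSC and SparseVector
# B = [A v]

# Workaround: promote the vector to a one-column sparse matrix first
vm = hcat(v)                   # n×1 SparseMatrixCSC
B  = [A vm]
```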


There is also a PR that fixes the problem, which will likely be backported to 0.6.1.


@kristoffer.carlsson, thank you for your work on PkgBenchmark.jl; I will give it a second try when I find the time. Is there a new version tagged with the fixes yet?


No, because I don’t think it has gotten enough testing to be tagged yet.