Hi! I’m Fred, one of the taco developers. I’m glad taco is mentioned here! I love Julia, and am really excited by the work Peter is doing.
The 100x number is taken from our academic paper and is the speedup we got over the Matlab Tensor Toolbox for several kernels; as far as we know, that is the only other system that supports the full tensor algebra with sparse tensors. We also compare against many kernels that others have hand-optimized, e.g. in Intel MKL, and for those our performance is comparable to the hand-optimized versions.
Our edge is that we compile each expression down to a custom kernel optimized for that particular expression. We also do this for compound expressions with many sub-expressions. So even for expressions nobody has hand-coded yet, you can still get hand-optimized performance with taco.
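To give a feel for what "a custom kernel for that particular expression" means, here is a hypothetical sketch (not taco's actual generated output) of the kind of specialized loop nest such a compiler emits for y = A*x with A stored in CSR format. The loop structure is tailored to both the expression and the sparse storage, so only the nonzeros are touched and no temporaries are built:

```python
def spmv_csr(num_rows, pos, crd, vals, x):
    """Compute y[i] = sum_j A[i,j] * x[j] for A in CSR format.

    CSR stores the matrix as three arrays: pos (row offsets into the
    nonzero arrays), crd (column index of each nonzero), and vals
    (value of each nonzero).
    """
    y = [0.0] * num_rows
    for i in range(num_rows):                # one pass over the rows
        for p in range(pos[i], pos[i + 1]):  # nonzeros of row i only
            y[i] += vals[p] * x[crd[p]]
    return y

# The 2x3 matrix [[1, 0, 2], [0, 3, 0]] in CSR form:
pos = [0, 2, 3]
crd = [0, 2, 1]
vals = [1.0, 2.0, 3.0]
print(spmv_csr(2, pos, crd, vals, [1.0, 1.0, 1.0]))  # → [3.0, 3.0]
```

For a different expression or a different sparse format, the generated loop nest would look quite different, which is exactly why a one-size-fits-all library routine tends to lose to per-expression code generation.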
We think Julia and taco are a great fit. Programmers can develop with Julia, which is really nice, and taco can run in the background, transparently generating custom kernels to compute linear algebra and tensor algebra expressions.