Has anyone done a performance comparison between Flux and TensorFlow on the speed of training a model? I mean taking some conveniently pre-defined model on a standard dataset (say MNIST) and training it with both libraries to see which goes faster. Are there comparisons like these out there?
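To make concrete what I mean by timing the training, here's roughly the kind of thing I have in mind on the Flux side. This is just a sketch: it assumes a reasonably recent Flux, uses random data as a stand-in for MNIST to keep it self-contained, and the model and hyperparameters are arbitrary choices of mine.

```julia
using Flux

X = rand(Float32, 784, 10_000)                  # stand-in for flattened MNIST images
Y = Flux.onehotbatch(rand(0:9, 10_000), 0:9)    # stand-in for labels
data = Flux.DataLoader((X, Y); batchsize = 128, shuffle = true)

model = Chain(Dense(784 => 32, relu), Dense(32 => 10))
loss(m, x, y) = Flux.logitcrossentropy(m(x), y)
opt_state = Flux.setup(Adam(1e-3), model)

# Time one epoch; the TensorFlow side would time the equivalent model.fit call.
@time Flux.train!(loss, model, data, opt_state)
```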
The most recent benchmarks I’ve found have shown Flux a bit faster:
though I would expect that not to be the case when deploying to large multi-GPU setups, which is where TensorFlow shines. I also know of:
http://denizyuret.github.io/Knet.jl/latest/tutorial/#Benchmarks-1
By all accounts, Knet is one of the fastest out there (and note that those Knet benchmarks were not written or run by the Knet authors; they weren't even written or run by regular Julia users!). Flux lags a bit behind in some areas, but uses Knet as a goalpost. Thus Knet can serve as a bridge: see how Flux compares to Knet, and how Knet compares to the rest of the world. Knet sacrifices some generality to get there by relying on hardcoded CUDA kernels for many things, so Flux is trying to get the "hardcoded performance" without relying on hardcoded kernels.
Thanks for the reply.
I did some benchmarks a while ago that can be found here:
https://github.com/oxinabox/oxinabox.github.io/blob/master/_drafts/JuliaDeepLearningMeetupLondon2019/Regression%20Demo.ipynb
https://github.com/oxinabox/oxinabox.github.io/blob/master/_drafts/JuliaDeepLearningMeetupLondon2019/Classification%20Demo%20--%20Am%20I%20dressed%3F.ipynb
These also compare several non-deep-learning libraries.
Keep in mind that Flux recently underwent a major change of back-end to Zygote.jl. I believe the first of @ChrisRackauckas's posts was after the Zygote transition, but the others look like they come from before (correct me if I'm wrong).
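For reference, Zygote is a source-to-source AD package, so gradients now come from differentiating ordinary Julia code. A trivial example, just to illustrate what the new back-end does (nothing Flux-specific here):

```julia
using Zygote

f(x) = 3x^2 + 2x + 1
gradient(f, 2.0)   # returns (14.0,) since f'(x) = 6x + 2
```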
It’s encouraging to see that the Flux performance is competitive, I know there had been significant concern about that in the past.
Yeah, mine were from before the Zygote transition.