@ChrisRackauckas Thank you for the explanation on the Titan Black and the licensing.
I worked with an oil and gas company that had literally thousands of GPUs in production. As these were in data centres, they could not update the drivers beyond that crucial version.
I did try to evangelise Julia when I was there, and I found out they had funded some Julia work in seismics a couple of years ago.
new blog post: “Which GPU(s) to Get for Deep Learning: My Experience and Advice for Using GPUs in Deep Learning”
2020-09-07 by Tim Dettmers
bonus:
Tim Dettmers: “In the community aspect, AMD is a bit like Julia vs Python. Julia has a lot of potential and many would say, and rightly so, that it is the superior programming language for scientific computing. Yet, Julia is barely used compared to Python. This is because the Python community is very strong. Numpy, SciPy, Pandas are powerful software packages that a large number of people congregate around. This is very similar to the NVIDIA vs AMD issue.”
Just saw that post as well; what an unexpected passage!
Perhaps the nuance here is that Julia does have a healthy and established community that a smaller (but still significant) number of people congregate around, whereas ROCm unfortunately does not. I’d concede the last sentence if the author replaced numpy/scipy/pandas with torch/tensorflow, because Julia’s deep learning community is proportionally much smaller than Python’s. Anecdotally, all of my research code is and continues to be written in Python. However, one would be hard-pressed to find a language community with the same confluence of “backend” implementation experience, application domain knowledge, and enthusiasm for building a competitive ML/DL ecosystem. That still leaves an organizational and human resource deficit, but we’re working on that.