State of machine learning in Julia

I just wanted to throw in my two cents (I tried to read the whole thread to see if anyone had said the same thing; I don’t think they did, but sorry if someone already has):

I have recently heard (again) from people who have seen the elegance of Julia/Flux but say they cannot give up the community/user base of their Python ML stack (probably meaning PyTorch or TensorFlow, but maybe JAX too).

Then it hit me: these people are comparing the community of the Python ML stacks with the community of Flux. But Flux (and to an even greater extent, Lux), as they say themselves, is really just Julia code. So we should compare the size of the Python ML stack community with the size of the Julia community, which is probably on par (or at least within a closer order of magnitude).

Yes, there will be some specific questions that only the Flux/Lux community can answer. But overall, most questions can be answered by people who are not in that community, because they are ultimately questions about Julia.

People coming from Python do not realize how much code sharing exists in Julia. They are used to thinking that if you use PyTorch and your favorite function from TensorFlow is not implemented in PyTorch, then too bad.
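
To make that concrete, here is a minimal sketch (assuming Flux’s current `Chain`/`Dense` API; `myact` is just a made-up helper name) showing that any plain Julia function can be dropped straight into a Flux model, no framework-specific port needed:

```julia
using Flux

# A plain Julia function -- nothing Flux-specific about it -- used
# directly as the activation of a layer, because a Flux model is just
# composed Julia functions.
myact(x) = x * tanh(log1p(exp(x)))   # a hand-rolled smooth activation

model = Chain(
    Dense(10 => 32, myact),
    Dense(32 => 1),
)

x = rand(Float32, 10, 16)   # a batch of 16 samples with 10 features each
y = model(x)                # 1×16 output; myact is used like any built-in

# Gradients flow through the custom function too:
grads = Flux.gradient(m -> sum(m(x)), model)
```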

I think Flux and Lux could try to advertise this more. Yes, they say they are 100% Julia code, but that alone is not enough for people coming from Python to understand. For instance, I like to give the example of the number of lines of code in Flux versus PyTorch, pointing out that both offer essentially the same functionality.
