Did Modular just reinvent Julia?

…but with a more pythonic syntax: Mojo 🔥: Programming language for all of AI


Looks like it’s not open source?

Edit: not yet (Modular Docs - Mojo🔥 FAQ)

Will Mojo be open-sourced?

Yes, we expect that Mojo will be open-sourced. However, Mojo is still young, so we will continue to incubate it within Modular until more of its internal architecture is fleshed out. We don’t have an established plan yet.


Still seems in the very early stages. e.g. “Mojo’s standard library has not yet grown a standard list or dictionary type”, no anonymous functions, “no polymorphism” (!) …


Seems to be OOP, which in my opinion is the wrong approach for scientific programming.


Did Modular just reinvent Julia?

They want it to be a faster Python, a superset of Python that is fully compatible with all of the Python ecosystem. They repeated that several times. It will be different from all the other approaches to fast Python. This time they’ll get it right. But I couldn’t understand how it’s supposed to achieve its goals (or exactly what they are), except that they will follow the model of integrating Swift and Objective-C, two statically compiled languages that are highly compatible with C. It doesn’t seem to make sense; there must be something I’m missing.

Also, what does this mean?

Mojo still doesn’t support classes, the primary thing Python programmers use pervasively! This isn’t because we hate dynamism


From what I can see this is an extension to Python and not really a programming language in its own right. More comparable to PyPy perhaps? Either way, looks cool, but I think I’ll stick with Julia for now. Too many new languages popping up all the time. :joy: Rust, Zig, Nim, Carbon, etc. I’m super behind.


Chris Lattner did compare to Julia on HN.


Naming “ownership and no GC” as “technical advancements” is a little odd to me.

Edit: that said, I’m very happy to see an encouraging reply written by Keno!


This is impressive: Modular Docs - Matrix multiplication in Mojo
I still wouldn’t want to program in Python, though.


@PetrKryslUCSD I think there was a post somewhere (twitter or HN?) about a simple LoopVectorization Julia matmul implementation that was way faster than that. Can’t find it now, maybe someone has the link?


Probably this HN comment by @staticfloat.

From what I understand the metaprogramming features in Mojo are more limited than in Julia. I wonder if something like LoopVectorization.jl would be possible in Mojo: a third party package that takes “naive” code and optimizes it with SIMD et al.
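For context, this is the kind of thing LoopVectorization.jl does today: you write a naive loop nest and annotate it with `@turbo`, and the macro rewrites it with SIMD, unrolling, and register tiling at macro-expansion time. A minimal sketch (essentially the gemm example from the LoopVectorization.jl README):

```julia
using LoopVectorization

# Naive triple loop; @turbo transforms it into vectorized, unrolled code.
function matmul!(C, A, B)
    @turbo for m in axes(A, 1), n in axes(B, 2)
        Cmn = zero(eltype(C))
        for k in axes(A, 2)
            Cmn += A[m, k] * B[k, n]
        end
        C[m, n] = Cmn
    end
    return C
end
```

The question above is whether Mojo’s compile-time parameter system is expressive enough for a third-party library to perform this kind of whole-loop-nest rewriting.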


That’s it, thanks.

Actually, something better than LoopVectorization. I discussed this very thing with Chris on the Discord. LV was rewritten in C++; there would be no need to do that given Mojo’s extensive parametric, Zig-like comptime system.


Are you saying LoopVectorization.jl was rewritten in C++?

“On the discord” means a private conversation with Chris, not a public channel there, right?

They are probably referring to this ongoing project GitHub - JuliaSIMD/LoopModels (it even has a blog with interesting dev updates)

LV is still alive and great. LM might be an even better future version of it, but it is still under development.


It’s public, in a side thread in the modular discord. Basically it seems that Mojo’s metaprogramming solves a lot of problems people have been clamoring about regarding generated functions and such.


Except for maybe the tiling, couldn’t most of the speedups demonstrated in that Mojo matrix multiplication demo be done in Julia as well?


Matrix tiling can be done in Julia as well.
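For example, here is a minimal hand-tiled matmul sketch in plain Julia, roughly the same transformation the Mojo demo applies. The tile size here is an arbitrary illustrative choice; a real implementation would tune it to the cache hierarchy (which is what the autotuning question below is about):

```julia
# Blocked (tiled) matmul: iterate over tiles so the working set
# of A, B, and C stays cache-resident. `ts` is the tile size.
function tiled_matmul!(C, A, B; ts = 64)
    fill!(C, zero(eltype(C)))
    n, m = size(C)
    K = size(A, 2)
    for jj in 1:ts:m, kk in 1:ts:K, ii in 1:ts:n
        for j in jj:min(jj + ts - 1, m), k in kk:min(kk + ts - 1, K)
            b = B[k, j]
            @inbounds @simd for i in ii:min(ii + ts - 1, n)
                C[i, j] += A[i, k] * b
            end
        end
    end
    return C
end
```

The inner `i` loop runs down columns of `A` and `C`, which matches Julia’s column-major layout.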


Is there a package for autotuning tiling?


So LoopModels is written in C++, while LoopVectorization was in Julia? What happened? That seems opposite to the general trend of moving things into Julia. Does this have to do with fundamental weaknesses in Julia? I’m quite surprised.

Edit: I guess actually reading the README.md is instructive; apparently it works at the LLVM level. Can other languages also benefit from LoopModels then?