Julia 1.7 even got a special build for the M1. Does that mean it is "hypertime" to buy a MacBook Pro?

Anybody got experience coding Julia on an M1, doing deep learning or anything else? Welcome to share with us, thanks.

No, it is not “hypertime to buy”.
Support is tier 3. Segfaults and other issues abound. I do not find native Apple Silicon usable for much non-trivial work.

Rosetta works well. You can also run an AArch64 Linux VM, which seems to avoid the issues in practice (in my light testing). Both can still be quite fast.

It’ll work and I think lots of people are very happy with the M1 + Julia, but I wouldn’t call it “hypertime” until the segfaults and threading hangs are fixed.


Highly agree with @Elrod.
I’m using an M1 MacBook Air, and it’s quite nice to run simple code under Rosetta 2.
However, when I switched to the native build of Julia 1.7, I immediately ran into incompatibility issues like this; in particular, I can no longer use SCS.jl.
So I turned back to Rosetta.

In short, I recommend forgetting about the native build for now.
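
If you’re ever unsure which build a given session is, a quick sanity check (this is just standard Julia introspection, nothing M1-specific):

```julia
using InteractiveUtils  # provides versioninfo(); loaded automatically in the REPL

# Sys.ARCH is :aarch64 for a native Apple Silicon build and
# :x86_64 for an Intel build running under Rosetta 2.
@show Sys.ARCH

versioninfo()  # prints the Julia version, OS, and CPU details
```

Handy when you have both builds installed and want to be sure which one a script or terminal launched.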


Thanks bros

Like the others have said, it’s not ready yet, but it is exciting because the native version does see a decent speedup! I’m champing at the bit for it to get to tier 1 support!


ha, maybe M2 max or M3 max then!! Thanks, guys!

I’m having very good luck with 1.7.1 on my Mac M1. I have not run into any of the issues that keep it from Tier 1. I think I’ve given it a reasonable test (2 papers + a book project). So far, so good.

It’s also really fast, even with OpenBLAS.
