Replacing my 2013 MacBook Pro with an M1 Pro or Max - Advice?

For me? Probably not.

Any updates on this? Am on the verge of buying the M1 Pro (16GB, 8-core CPU, 14-core GPU). Will be using Julia mostly to run stiff ODEs with diffeq.jl, optimizing with respect to a small number of parameters using Optim.jl.
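For reference, the workflow described above looks roughly like this. This is only a sketch with a made-up one-parameter model and synthetic data; the solver choice (Rodas5) and the Nelder-Mead optimizer are my assumptions, not anything from this thread:

```julia
using DifferentialEquations, Optim

# Toy stiff-style two-compartment model with one free rate parameter p[1]
function f!(du, u, p, t)
    du[1] = -p[1] * u[1]
    du[2] =  p[1] * u[1]
end

u0 = [1.0, 0.0]
prob = ODEProblem(f!, u0, (0.0, 10.0), [1.0])

# Synthetic "observed" data generated with the true parameter 2.5
data = Array(solve(remake(prob, p = [2.5]), Rodas5(), saveat = 1.0))

# Least-squares loss over the saved time points
function loss(p)
    sol = solve(remake(prob, p = p), Rodas5(), saveat = 1.0)
    sum(abs2, Array(sol) .- data)
end

res = optimize(loss, [1.0], NelderMead())
Optim.minimizer(res)  # should recover a value near 2.5
```

For a small number of parameters, a derivative-free method like Nelder-Mead avoids differentiating through the stiff solver, though Optim's gradient-based methods also work if you set up sensitivities.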


I got a MacBook with an M1 Max and I mostly use Julia for ODEs and optimization. No regrets. It's several times faster for the same tasks compared to my 2017 MacBook. Everything works under Rosetta, and ODEs and JuMP even work with native 1.7. Native is about 1.5-2x faster based on limited benchmarks (I'll post a few from ODEs and JuMP when I get a chance). There are some bugs with multithreading, for both native and Rosetta, where Julia deadlocks if you use too many threads (https://github.com/JuliaLang/julia/issues/41820), plus separate bugs on native with random segfaults (https://github.com/JuliaLang/julia/issues/41440), and some binaries don't work. I expect these to improve significantly in the next few months, since about 25% of Julia users are on Macs and they're all slowly switching to Apple Silicon.
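If anyone wants to run their own native-vs-Rosetta comparison, here's a sketch using BenchmarkTools.jl with the classic stiff Robertson problem as a stand-in workload (this is my own example, not the benchmarks mentioned above):

```julia
using BenchmarkTools, DifferentialEquations

# Robertson stiff ODE: a standard stress test for implicit solvers
function rober!(du, u, p, t)
    k1, k2, k3 = p
    du[1] = -k1*u[1] + k3*u[2]*u[3]
    du[2] =  k1*u[1] - k2*u[2]^2 - k3*u[2]*u[3]
    du[3] =  k2*u[2]^2
end

prob = ODEProblem(rober!, [1.0, 0.0, 0.0], (0.0, 1e5), (0.04, 3e7, 1e4))

# Confirm which build you're running before timing:
# Sys.ARCH is :aarch64 for native, :x86_64 under Rosetta
@show Sys.ARCH

# Run this under both builds and compare the reported times
@btime solve($prob, Rodas5(); abstol = 1e-8, reltol = 1e-8);
```

Running the same script under a native and a Rosetta Julia install gives a fair like-for-like comparison, since compilation overhead is excluded by `@btime`.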


I really wish my new M1 iPad could use this… my current workaround is to connect to code-server on my four-year-old iMac, so ridiculous…

I’m transferring from Time Machine to my new MacBook Pro as I type this. (FYI - FedEx delivered it a week early; I found it sitting on my deck like any Amazon package.)


Let us know what you think! Also, what did you end up going with, M1 Pro or M1 Max?


I ended up with an M1 Pro, 32GB and 1TB. I like it very much for its cool, efficient operation and satisfying speed. However, it has serious memory leaks that, especially early on, sometimes left me running on swap during mundane operations. Things are improving, but I'm annoyed that the gushing reviewers don't mention this issue, and neither does Apple, even though it's a known and expected problem for such a new architecture.
