That still wouldn’t fix the bigger problem Valentin raised which is
I assumed some sort of depot did exist, but I didn’t check myself
Sounds right to me. Indeed it would be fantastic if Colab packaged a precompilation cache, regardless of what packages sit in the base environment.
However, one worry I have here is cache invalidation from newly installed packages; note that Colab does not update its packages that frequently, IIRC. Artifacts would also be tricky, in case the new packages require new artifacts. A larger base environment (with frozen versions), while not perfect, does mean you can get started working more quickly. It’s similar to Python: installing PyTorch can take a while, so I don’t mind that Colab pins it at a slightly older version.
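For concreteness, a rough sketch of what baking such a cache into the image could look like (the package set and pinning step here are purely illustrative, not what Colab actually ships):

```julia
# Hypothetical image-build step: install a frozen set of base packages
# and populate the precompilation cache (~/.julia/compiled) ahead of time,
# so user sessions start warm.
using Pkg
Pkg.add(["CUDA", "Plots", "DataFrames"])
Pkg.pin(["CUDA", "Plots", "DataFrames"])   # freeze the resolved versions
Pkg.precompile()
```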
Oops! My brain said “I’m using +(::CuArray, ::CuArray); I’m good to go”, but timing just the kernel launch strikes again! I thought it was weird that the GPU line was flat, lol; I chalked it up to the GPU maybe not saturating even at those longer arrays (which it definitely should at 10^8 elements). Thanks for catching that!
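For anyone following along, a minimal sketch of the pitfall (array size is illustrative): GPU operations launch asynchronously, so the timed region has to synchronize to measure the actual kernel time.

```julia
using CUDA, BenchmarkTools

x = CUDA.rand(Float32, 10^8)
y = CUDA.rand(Float32, 10^8)

@btime $x + $y;              # mostly measures the asynchronous kernel launch
@btime CUDA.@sync($x + $y);  # waits for the kernel, so this times the real work
```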
I have edited my original post to disclose this and linked to your post!
I believe you can have JavaScript run via display(MIME("application/javascript"), yourjs_string), so that could act as a “listener” for a message… I was experimenting with this a bit for WGLMakie support, but probably wasn’t using the IJulia interface correctly.
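Roughly something like this (a sketch; the JavaScript payload is just illustrative, and I haven’t verified how Colab’s frontend handles it):

```julia
# Display a snippet under the "application/javascript" MIME type;
# IJulia sends it as display data, and a frontend that executes
# JavaScript outputs can then act on it (e.g. as a message listener).
js = """
window.addEventListener("message", (event) => {
    console.log("message from the page:", event.data);
});
"""
display(MIME("application/javascript"), js)
```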
There is an open issue: Support Pluto notebooks for Julia · Issue #5167 · googlecolab/colabtools · GitHub
Any plans to support the latest stable version of Julia? (Currently 1.11.3).
On the roadmap, but a little ways out.
We did just upgrade to 1.10.9 following its release a couple days ago. We also pre-install CUDA.jl now (configured to use system-installed CUDA), to save some time.
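If you want to verify that setup from a notebook, something like the following should do it (a sketch using CUDA.jl’s standard API):

```julia
using CUDA
CUDA.versioninfo()   # should report the system-provided (local) CUDA toolkit
CUDA.functional()    # true when a GPU runtime is attached and usable
```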
This is an interesting idea. It’s slightly different than our current experience as @MilesCranmer mentioned, so not sure where it will land, but one we’re thinking on.
This seems a bit concerning to me and probably worth filing an issue with a minimal repro.
This worked for me! Thank you very much for the “hacky” workaround.
Would really love to see this, especially a smooth UX for mounting Google Drive!
Hi, I don’t seem to be able to get Reactant to find any Nvidia GPUs when I select a GPU runtime on Colab. E.g. when I select the A100 or the L4 GPU and then try to run:
```julia
Reactant.set_default_backend("gpu")
```
I get the following error:
```
No GPU client found

Stacktrace:
 [1] error(s::String)
   @ Base ./error.jl:35
 [2] client(backend::String)
   @ Reactant.XLA ~/.julia/packages/Reactant/cTiTU/src/xla/XLA.jl:82
 [3] set_default_backend
   @ ~/.julia/packages/Reactant/cTiTU/src/xla/XLA.jl:104 [inlined]
 [4] set_default_backend(backend::String)
   @ Reactant ~/.julia/packages/Reactant/cTiTU/src/Reactant.jl:293
 [5] top-level scope
   @ In[15]:1
```
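A couple of sanity checks that might help narrow this down (a sketch; it assumes CUDA.jl is also available in the runtime):

```julia
run(`nvidia-smi`)        # is the GPU visible to the Colab VM at all?

using CUDA
CUDA.functional()        # does CUDA.jl find a usable device?
collect(CUDA.devices())  # list the devices it sees
```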
This thread has been about the inclusion of Julia as a backend for Colab. Because your question is about using Julia in Colab, I think it would be more appropriate as its own topic under the colab tag.
Would you mind opening an issue in Reactant.jl?
Sure thing!
Thanks so much for setting up the Julia Colab, this is great! I’m similarly struggling with very slow precompilation times for custom setups, so I opened an issue with an MWE: Julia pkg loading and compilation slow · Issue #5608 · googlecolab/colabtools · GitHub