Julia in Colab

Hi Julia community!

I’m new to posting here (this is my first one :-)) and wasn’t sure of the right place to post, but I’d like to share that Colab now provides the Julia language in our runtimes (Julia as a language in runtimes · Issue #5151 · googlecolab/colabtools · GitHub).

Right up front: I work for Colab, and I think the Julia language is pretty awesome. I have some background in math and data science, and Julia has felt like a very natural language. I’ve wanted to see it in Colab for some time (it is, after all, part of the original three in Jupyter).

We don’t provide many pre-installed packages yet, but I’d love to hear feedback about your experiences with the Julia language in Colab, and where it hits and where it misses!

89 Likes

You were beaten in the announcement by other very excited users:

7 Likes

Indeed :slight_smile: Let’s make @Metrizable’s first post the main thread! I will update mine to link here.

8 Likes

This is awesome!!!

By the way, if you want to run your Julia code on Cloud TPUs, I confirmed that using Reactant works in the new Julia Colab (Google Colab)
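For anyone curious what that looks like, here is a minimal sketch. The package name comes from the post above, but the specific calls (`Reactant.to_rarray`, `@compile`) and the example function are my assumptions based on Reactant.jl's documented API, not the poster's notebook:

```julia
using Reactant  # assumes Pkg.add("Reactant") has been run on a TPU runtime

# Move a plain Julia array onto the XLA device (the TPU, on a TPU runtime)
x = Reactant.to_rarray(rand(Float32, 1024, 1024))

square(a) = a .* a

# Trace and compile the function for the device, then run it there
compiled_square = @compile square(x)
compiled_square(x)
```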

21 Likes

This is huge, thank you! Is it possible to use Colab for teaching an introduction course for Julia?

3 Likes

I would imagine so. At SIGCSE 2025, this was definitely a topic that came up.

1 Like

Wow, neat! I wasn’t aware of Reactant. Nice!

2 Likes

Wow! This is incredible news @metrizable! It sounds like you had an outsize role in making this happen alongside your team – many thanks! This has been a requested feature for years so it is exciting to see it happen.

I know it is your first time posting on Discourse so welcome to the Julia Discourse! You are also welcome to join the Slack community where folks are having an excited discussion about the news!

Cheers!

~ tcp :deciduous_tree:

6 Likes

A quick and dirty CPU vs GPU benchmark! Using the (free!) “T4 GPU” runtime. Stopped at 10^8 because 10^9 exceeded either the 12.7GB of RAM or 15.0GB of VRAM.

using Plots, CUDA, BenchmarkTools

pMax = 8                # 10^9 exceeded the runtime's RAM/VRAM
powerVector = 1:pMax
timeVectorCPU = Vector{Float16}(undef, pMax)
timeVectorGPU = Vector{Float16}(undef, pMax)

for p in powerVector
    n = 10^p
    xCPU, yCPU = ones(n), ones(n)
    xGPU, yGPU = cu(xCPU), cu(yCPU)

    timeVectorCPU[p] = @belapsed $xCPU + $yCPU
    # CUDA kernels launch asynchronously; synchronize so we time
    # the actual work rather than just the kernel launch
    timeVectorGPU[p] = @belapsed CUDA.@sync $xGPU + $yGPU
end

timeVectorCPU |> display
timeVectorGPU |> display

plot(
    10 .^ powerVector, 
    [timeVectorCPU timeVectorGPU], 
    label = ["CPU" "GPU"], 
    title = "CPU vs GPU",
    xscale = :log10,
    yscale = :log10,
    ylabel = "Elapsed time [s]",
    fmt = :png
)

16 Likes

Can we run Pluto notebooks in Google Colab?

That’s good news. Now we have to knock on the door of Kaggle Notebooks :face_with_hand_over_mouth:

1 Like

Lazy question: how do I set the path to a subfolder? (My notebook is on Google Drive and I can run it, but pwd() returns "/content".)

As of a couple of months ago, Kaggle started deriving their Docker images from Colab’s public releases, so this is a real possibility in the near future. Colab hasn’t pushed the image with the Julia kernel to the public repo yet, but once it does, at least the bits should be there.

8 Likes

Congratulations!

That’s a big thing and a great opportunity for growing the community.

Ju Pyt R, revisited :balloon::upside_down_face:

3 Likes

Currently in Colab, we don’t provide a simple way to determine the Google Drive path of the notebook you’re editing. I am aware of code snippets that allow this discovery, but they’re a little slow.

One way to access artifacts on Google Drive is to mount it. There isn’t yet a smooth UX for this in the Julia runtime, though there is in Python. Until something is available, one “hacky” workaround is to use the Python runtime to mount Google Drive (it remains mounted), then switch to the Julia runtime via the Runtime menu (or by patching the session).

I’m aware that some FUSE/rclone/etc. approaches are out there, too.
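To make the mount-then-switch workaround concrete, here is a sketch. The mount point `/content/drive` is the default used by Colab's Python `google.colab.drive.mount`, but the subfolder name below is hypothetical:

```julia
# First, from a *Python* runtime, mount Drive with:
#   from google.colab import drive
#   drive.mount('/content/drive')
# Then switch the runtime to Julia; the mount persists, so ordinary
# filesystem calls work. "MyProject" is a hypothetical folder name.
cd("/content/drive/MyDrive/MyProject")
pwd()       # now reports the Drive subfolder instead of "/content"
readdir()   # list the files next to your notebook
```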

1 Like

I was trying just now to directly use google.colab from within Julia, with some limited success

1 Like

Agreed. I had tried similar, but it seems the Python that’s invoked is not associated with the IJulia kernel. There may be something more we could do here to hook those up.

It would also be cool to build up native Julia tooling to replicate functionality found in the google.colab Python package, like colabtools/google/colab/drive.py at e8519e12f553b0597c0e067cd9e4df821bdc6b2e · googlecolab/colabtools · GitHub.
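One way to experiment with that today, sketched under the assumption that PythonCall.jl is installed. As noted above, the Python that Julia links against may not be the one with google.colab available, in which case the import will fail:

```julia
using PythonCall  # assumes Pkg.add("PythonCall") has been run

# Import the google.colab drive module through Python interop.
# This only works if PythonCall resolves to the Python environment
# where google.colab is installed, which may not hold in the Julia
# runtime per the discussion above.
drive = pyimport("google.colab.drive")
drive.mount("/content/drive")
```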

5 Likes

(post deleted by author)

Can GPU be used? I tried to add CUDA.jl, but it said

CUDA.jl could not find an appropriate CUDA runtime to use.

First, in “Select a runtime”, choose a GPU accelerator instead of the CPU. Then add CUDA to the environment with Pkg.add("CUDA") (they’re working on getting CUDA pre-added to GPU runtimes; see metrizable’s link). Then you can run using CUDA; CUDA.versioninfo().
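Putting those steps together once a GPU runtime is selected, a minimal check might look like this (the vector add at the end is just an arbitrary smoke test, not part of the required setup):

```julia
import Pkg
Pkg.add("CUDA")        # not yet pre-installed on GPU runtimes

using CUDA
CUDA.versioninfo()     # should now report the T4 and its driver

# Quick smoke test: add two vectors on the GPU and check the result
a = CUDA.ones(Float32, 10^6)
b = CUDA.ones(Float32, 10^6)
c = a .+ b
@assert all(Array(c) .== 2f0)
```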