How to start using LLMs to help write Julia code?


Continuing the discussion from Is Julia 2.0 needed?:

I’m curious how people are integrating AI tools to help them code. Seems this is growing in popularity, but I’ve not really looked into how to do it myself yet.

Are there free options available, or is everyone paying monthly subscriptions for this help?
Is Copilot the only option that works with VSCode, or are there ways to integrate other LLMs?

The VSC Codeium extension works nicely and is free.

6 Likes

I’m using Copilot, but I have a free education account.

(it would be cool to have something like that integrated into the Julia REPL, through a package).

2 Likes

(it would be cool to have something like that integrated into the Julia REPL, through a package).

See current options here: GitHub - svilupp/awesome-generative-ai-meets-julia-language: Comprehensive guide to generative AI projects and resources in Julia.
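For example, PromptingTools.jl (one of the packages listed there) lets you query an LLM straight from the REPL. A minimal sketch, assuming you have an OpenAI API key set in the `OPENAI_API_KEY` environment variable; the prompts and the model alias are just examples, so check the package docs for the exact names:

```julia
using PromptingTools

# One-off question via the ai"" string macro (uses the configured default model)
msg = ai"Write a Julia function that returns the n-th Fibonacci number"
println(msg.content)

# Or the explicit call, picking a model by its alias
msg = aigenerate("Explain what `@views` does in Julia"; model = "gpt4o")
println(msg.content)
```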

4 Likes

The latest VSCode (updated today) says it has “copilot support for the native REPL”, where the “native REPL” is apparently something used by the Python extension.

In the Julia REPL I didn’t see anything happening. Does anyone know what that is about?

I am using aider with a Claude 3.5 Sonnet API key. However, for some tasks it works best to first generate Python code and then translate it to Julia.

aider works straight from your command line (a sort of TUI) and is therefore used in parallel with your editor of choice. I then use my editor to inspect and adjust the generated git diff.

Continue + Ollama is also easy to set up: An entirely open-source AI code assistant inside your editor · Ollama Blog
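Relatedly, if you already have an Ollama server running locally, you can also talk to it straight from the Julia REPL via PromptingTools.jl, so nothing leaves your machine and there is no subscription. A rough sketch, assuming Ollama is serving on its default port and you have already pulled a model; the model name below is only an example, and the exact schema name may differ between PromptingTools versions:

```julia
using PromptingTools
const PT = PromptingTools

# Point PromptingTools at the local Ollama server instead of a hosted API.
# "llama3" is only an example; use whatever model you pulled with `ollama pull`.
schema = PT.OllamaSchema()
msg = aigenerate(schema,
                 "Refactor this loop into a broadcasted expression: for i in 1:n; y[i] = 2x[i]; end";
                 model = "llama3")
println(msg.content)
```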