Has anyone got an LLM to work with any Julia editor/IDE/whatever?

I’ve installed ollama with DeepSeek Coder v2, and it seems plenty fast enough on my old Mac laptop, but I can’t figure out how to plug it into any editor. I’ve tried VS Code with Continue, but somehow it isn’t doing anything: I get a Continue window, it indexes my codebase, and then nothing happens. I can chat with the LLM, but it is completely unaware of my Julia code, and I’m not seeing any code completion or tab completion when I edit my .ipynb files.
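For what it’s worth, one quick way to confirm the ollama side is serving correctly, independent of any editor plugin, is to hit its HTTP API directly from Julia. This is only a minimal sketch: it assumes ollama’s default port 11434 and that the model is tagged `deepseek-coder-v2` locally, so adjust the names to whatever `ollama list` shows on your machine.

```julia
# Minimal sanity check: ask the local ollama server to complete a prompt.
# Assumes ollama is running on its default port (11434) and the model tag
# matches what `ollama list` reports.
using HTTP, JSON3

function ask_ollama(prompt; model = "deepseek-coder-v2",
                    host = "http://localhost:11434")
    body = JSON3.write((model = model, prompt = prompt, stream = false))
    resp = HTTP.post("$host/api/generate",
                     ["Content-Type" => "application/json"],
                     body)
    return JSON3.read(resp.body).response  # ollama puts the completion in `response`
end

println(ask_ollama("Write a Julia function that reverses a string."))
```

If that returns sensible text, the model and server are fine and the problem is on the editor-integration side.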

Also, the chat window is all sorts of janky. I can’t reliably delete anything, no matter how hard I click on the little garbage bin.

Has anyone got this to work? It seems like it’s basically there but I’m just too dumb to get it to go…

Edit: I’ve now tried Jupyter AI. I asked it what my code does, and it picked a docstring seemingly at random and printed it. On my second question, DeepSeek Coder v2 started talking to me in Chinese. I’m also not sure Jupyter AI integrates with the .jl file editor.

I have been using Cursor. It seems to do ok? I have not managed to become a power user of it yet, but it has definitely helped me out plenty of times.


I’m looking into Cursor, thanks for the tip. :slight_smile:
Have you figured out how to connect it to a local LLM?