[ANN] julia-mcp — persistent Julia sessions for AI assistants

If you use AI coding assistants (Claude Code, Claude Desktop, VS Code Copilot), you’ve probably noticed they pay Julia’s startup and compilation costs on every single code execution and test run. That adds up fast during iterative development.

julia-mcp is an MCP server that keeps Julia sessions alive across calls. Variables, functions, and loaded packages persist between executions, and the assistant can iterate on code without waiting for recompilation each time.
Works for tests as well!
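To give a rough idea of the mechanism (a minimal sketch, not the actual julia-mcp implementation; a Python child process stands in for the `julia` binary so the snippet runs anywhere, and the `run` helper name is made up for illustration):

```python
import subprocess, sys

# A long-lived child process that evaluates lines and prints results --
# a stand-in for a persistent `julia` process. State lives in the child,
# so it survives across calls instead of being rebuilt each time.
child_code = (
    "import sys\n"
    "ns = {}\n"
    "for line in sys.stdin:\n"
    "    try:\n"
    "        print(repr(eval(line, ns)))\n"
    "    except SyntaxError:\n"
    "        exec(line, ns)\n"
    "        print('ok')\n"
    "    sys.stdout.flush()\n"
)
proc = subprocess.Popen(
    [sys.executable, "-c", child_code],
    stdin=subprocess.PIPE, stdout=subprocess.PIPE,
    text=True, bufsize=1,  # line-buffered pipes, one request per line
)

def run(code: str) -> str:
    """Send one line to the persistent session, return its printed result."""
    proc.stdin.write(code + "\n")
    proc.stdin.flush()
    return proc.stdout.readline().strip()

run("x = 21")          # state set in one call...
print(run("x * 2"))    # ...is visible in the next: prints 42
```

The point is that only the first call pays the startup cost; every later call reuses the warm process, which is where the savings against Julia's startup and compilation time come from.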

Key design choices:

  • Fully automatic — sessions start on demand, recover from crashes, and clean up on shutdown. The only setup is a one-time MCP registration.

  • Isolated per project and per AI session — each project directory gets its own Julia process. No cross-contamination between environments and sessions.

  • Pure stdio — no TCP ports or sockets to manage.

  • Revise.jl is loaded automatically if available, so code changes are picked up without restarting.

There are other projects in this space — MCPRepl.jl, REPLicant.jl, DaemonConductor.jl — each with a different take.
julia-mcp’s angle is zero manual management: you register it once and forget about it.

Setup for Claude Code:

cd /any_directory
git clone https://github.com/aplavin/julia-mcp.git
claude mcp add --scope user julia -- uv run --directory /any_directory/julia-mcp python server.py

The server is written in Python (the MCP protocol ecosystem is most mature there); the only dependencies are uv and Julia itself.

Repo: https://github.com/aplavin/julia-mcp (MCP server for persistent Julia sessions: fast iteration without startup/compilation overhead).

Feedback and suggestions welcome!


Replace /path/to/julia-mcp with the actual path where you cloned the repo.

Sorry, can you clarify what this means? What repo? A simple real example would be better than this description.

You mean, where this landed?

git clone https://github.com/aplavin/julia-mcp.git

Tweaked the example in the post and in the README! Is it clearer now?

Yes.

Yes, I guess it was the mention of Python that confused me about what the server meant.

I love seeing all the experimentation here. I tried a similar path (using a Jupyter kernel for a persistent Julia session with MCP) but ultimately I wanted a REPL that I could seamlessly interact with manually or with Claude. It turns out that Claude is quite good at reading from/writing to tmux sessions, so I basically just ended up with a skill and a few scripts to have it launch a juliarepl_project tmux session.

This is definitely more of a DIY approach, but I’d recommend it to people who enjoy heavily customizing their workflows.

I think the terminology surrounding MCP is a bit weird to be honest :slight_smile: Especially the “server” name can sound confusing in the common case where the LLM starts a new “server” for each session anyway; the servers are not shared and just communicate through stdio.
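Concretely, the whole “server” exchange is roughly this (a hedged sketch: the real protocol also has an initialize handshake and capability negotiation, and the `run_julia` tool name here is made up):

```python
import json

# One message over MCP's stdio transport is a line of JSON-RPC 2.0.
# The client writes a request to the server's stdin...
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "run_julia", "arguments": {"code": "1 + 1"}},
}
line = json.dumps(request)
print(line)

# ...and the server parses it off the pipe and dispatches to a tool.
decoded = json.loads(line)
print(decoded["params"]["name"])  # prints run_julia
```

So despite the name, nothing is listening on the network; each client just spawns its own child process and talks to it over pipes.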

Yes, the focus here is definitely not on a “shared” REPL between the LLM and a person.
julia-mcp is intended to be a better (more efficient) way for an LLM to run Julia code and tests, with the design goal of being fully automatic, without any management.
Nothing more, nothing less (:


Awesome! I tried to build something like that (for the same purpose) using DaemonMode.jl very recently, but didn’t manage to make it work.

I want to distribute a modified version of the code. I don’t see a license so I wanted to check if you are okay with that.


Sure! Added MIT license.
Also, curious to see your modifications, anything in particular you found lacking? :slight_smile:


Nice, will try it out!
For Cursor users encountering errors: the shell that launches the MCP server doesn’t have access to uv in PATH. I needed to change the command from "uv" to the full path "path/to/your/uv" (found with which uv in a normal terminal).

Currently I’ve modified it to remove the use of TestEnv and to set the "--startup-file=no" and "--threads=auto" flags on startup, but those might be specific to how I am using Julia right now.


Today when trying Claude Code with this MCP, the TUI completely froze up and started leaking gigabytes of memory. This happened after Claude ran some Julia code that segfaulted. Not sure if this is a bug in Claude Code or in the MCP server.


I’m not familiar with that, but happy to improve installation instructions! How is this typically handled in Cursor for other MCPs written in Python?

Hmmm, interesting… I indeed assumed “everyone” would want to run Julia with these flags :slight_smile: Can add an option, of course.


I am not sure if there is any standard way specifically for Cursor. I also think this may not be a Cursor-specific issue, but that it depends on how uv is installed. For me, which uv gives ~/.local/bin/uv, and I add that directory to PATH in my .bashrc, so ~/.local/bin is not included in the system-wide PATH. So I would imagine this could be a hiccup for any agentic tool, not just Cursor. My config (.cursor/mcp.json):

{
  "mcpServers": {
    "server1": {
      ...
    },
    "server2": {
      ...
    },
    "julia": {
      "command": "/home/username/.local/bin/uv",
      "args": [
        "run",
        "--directory",
        "/home/username/servers/julia-mcp",
        "python",
        "server.py"
      ]
    }
  }
}

Edit: use absolute paths in the config, to avoid an ENOENT error when using the Cursor CLI.
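If you want to find that absolute path programmatically (the same thing which uv does in a terminal), Python's stdlib can do it; a small sketch:

```python
import shutil

# Resolve the absolute path of an executable on this process's PATH,
# like `which uv`. Returns None if it isn't found -- which is exactly
# the failure mode when the MCP launcher shell has a minimal PATH.
uv_path = shutil.which("uv")
print(uv_path)  # e.g. /home/username/.local/bin/uv, or None
```

Whatever it prints is what goes into the "command" field of the config.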


See the updated version: you can now override the default CLI arguments and pass any arguments to the Julia process.


Thanks. I used this with a devcontainer (with devpod) and Claude Code to create a Julia library for working with RDF, SPARQL and SHACL. I cannot (yet) vouch for the LinkedData.jl library I vibed, but it sure was fun to create. julia-mcp made it quicker to build. And the devcontainer setup worked well. Repo: ahjulstad/LinkedData.jl (vibe-coded RDF tools for Julia).


I am having issues with the AI writing Julia code to edit or remove files in the .julia folder, because it somehow thinks it needs to fix something broken in Julia itself. What kind of sandboxing do people use to prevent this?

Honestly I never saw anything like that!