NeuroREPL.jl – Community Interest Check

Hi all, NeuroREPL.jl is a Julia implementation of the Model Context Protocol (MCP) that turns the Julia REPL into a programmatic API for AI agents. NeuroREPL provides process-isolated code execution, dual MCP transport support, and introspection tools designed specifically for agent workflows.
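For context, an MCP tool call is an ordinary JSON-RPC 2.0 request. Here is a minimal sketch of what an agent-side call to such a server might look like over the HTTP transport; the `eval_code` tool name, the port, and the `/mcp` path are my own illustrative assumptions, not NeuroREPL's actual identifiers:

```julia
using HTTP, JSON3  # JSON3 assumed here for JSON serialization

# Hypothetical MCP "tools/call" request asking the server to evaluate code.
body = JSON3.write(Dict(
    "jsonrpc" => "2.0",
    "id"      => 1,
    "method"  => "tools/call",
    "params"  => Dict(
        "name"      => "eval_code",                       # hypothetical tool name
        "arguments" => Dict("code" => "sum(1:10)"),
    ),
))

# Endpoint and port are assumptions; adjust to the server's configuration.
resp = HTTP.post("http://localhost:8080/mcp";
    headers = ["Content-Type" => "application/json"],
    body    = body)
println(String(resp.body))
```

The response would carry the evaluation result back as a JSON-RPC result object, which is what makes the REPL usable as a programmatic API.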

The system is built on Malt.jl for worker-process isolation and HTTP.jl for dual transport (SSE and Streamable HTTP), and it provides 28 specialized tools across 8 functional categories, including core execution, package management, code introspection, performance analysis, and server control. The architecture separates the MCP server from the execution environment, so code execution cannot crash the server, while persistent state is maintained across tool calls.
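The isolation idea can be sketched with Malt.jl directly: a worker process holds state across evaluations, and stopping (or crashing) it leaves the host process untouched. This is a minimal sketch of the pattern, not NeuroREPL's actual code:

```julia
using Malt

# Spawn an isolated worker process; the host (here standing in for the
# MCP server) stays alive even if code evaluated in the worker fails.
w = Malt.Worker()

# State persists across separate evaluations in the same worker,
# mirroring how state persists across MCP tool calls.
Malt.remote_eval_wait(w, :(x = 21))
result = Malt.remote_eval_fetch(w, :(x * 2))
@show result  # 42

# The worker can be stopped and replaced without affecting the host.
Malt.stop(w)
```

Compared with Distributed.jl, Malt.jl workers are plain subprocesses designed to be disposable, which fits the "execution cannot crash the server" requirement well.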

NeuroREPL.jl is inspired by the MIT-licensed MCPRepl.jl project but represents a significant rewrite focused on improved agent communication and enhanced debugging capabilities. The implementation is contained in a single ~2,000-line Julia module for ease of modification and understanding. My initial idea was to use Distributed.jl; however, a colleague of mine (I hope he does not mind) suggested Malt.jl, and I think it was the right choice (thanks RA).

Testing has been performed with Kiro CLI v1.25.0 (AWS), Crush v0.43.0 (Charm), and OpenCode v1.2.10. While core functionality is fairly solid, the package remains in active development and is not yet production-ready.

Some current limitations include:

  • No LSP Integration
  • No Debugger Integration
  • No Full REPL Features in Worker Mode
  • No Binary/Image Output Support
  • Limited Error Recovery
  • Single Worker Process for MCP Requests
  • No Explicit Session Management via MCP Protocol
  • No Security Layer

Potential Enhancements include:

  • Parallel Agent Collaboration
  • Multi-Model Solution Validation and Ranking
  • Meta-AI System: Competitive + Collaborative Multi-Model Approach

If there is any interest from the community, I would be happy to publish the code. I was thinking of using the MIT License, but if something more substantial develops, or if anyone has serious ideas about alternatives, I of course remain open to suggestions.

A screenshot showing the starting sequence and functionalities is attached below.


I’d definitely download it and set it up on a LAN server.