How to attach to an existing remote REPL?

Have you considered using a jupyter notebook?
That should allow using a remote julia session (which stays alive)
Or is this not an option for you?
You can check this out on juliabox
Apologies if I misunderstood your problem.

screen or tmux are the canonical solutions to this problem.
Juno can automate that process for you pretty well, if you want to give it a go.
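If tmux is new to you, the detach/reattach cycle is only a handful of commands (the session name `sim` is just an example):

```shell
tmux new -s sim            # start a named session on the server
# ... launch julia and the long-running job inside it ...
# press Ctrl-b d to detach; the session (and Julia) keeps running
tmux attach -t sim         # reattach later from a fresh SSH login
```

screen works the same way with `screen -S sim` to start and `screen -r sim` to reattach.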


Thank you for the suggestion but I really need to start my simulations from the command line. Carlo

Actually I was planning to use Revise if I manage to get this working. Thank you

@johnh and @pfitzseb
I knew about screen, but it didn't come to mind as a solution to this problem. I think screen would get me what I want, but if I could attach and detach to a REPL using the same tools used for distributed computing, I could also get the results of my simulations, which would help me understand whether everything went OK. I could probably do that with screen as well, but it would be more complicated.

Attaching a REPL to a running remote process is in principle possible but no one has done the work to make it happen. Feel free to open a feature request issue on GitHub. Note that there are pretty hefty security implications of allowing such a thing, so it’s not just a matter of making it work, but also of making it work safely.

Thank you for the feedback. Would it be just as unsafe using the Distributed.jl package infrastructure? I don't see a big difference between attaching to a running Julia process and doing distributed computing with Julia.

True, being able to start a REPL on any worker would be safe and probably easier to implement since the “RPC” already just works.
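To illustrate the idea, here is a rough sketch of a "remote REPL" built on Distributed's existing RPC. This is not an existing API, just an approximation; the hostname and the read-eval loop are illustrative:

```julia
# Sketch: evaluate REPL-style input on a Distributed worker.
# "myserver" is a placeholder SSH-reachable hostname.
using Distributed
addprocs(["myserver"])        # connect one worker over SSH
w = first(workers())

# Crude REPL loop: read a line locally, evaluate it on the worker.
while true
    print("worker $w> ")
    line = readline()
    isempty(line) && break
    expr = Meta.parse(line)
    result = remotecall_fetch(Core.eval, w, Main, expr)
    result === nothing || println(result)
end
```

The missing piece for a real feature is attaching this to an *already running* process rather than one started via `addprocs`.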

For server-side long-running applications (everything web dev) this would be a very valuable feature. For reference, it is available on Erlang/Elixir. The value resides in the fact that
a. one can attach to a running session to debug issues and inspect application state
b. we could theoretically update the running code by manually triggering Revise to inject application updates without restarting the app (no downtime)


If you start a remote IJulia via remote_ikernel you can attach as many local clients as you want. The catch is that you have to start IJulia via remote_ikernel, and losing the connection will kill the remote process. I've been wanting to write a "Jupyter kernel proxy" that can bridge a pre-existing remote Jupyter kernel and a local client through SSH. This would help not only the Julia community but the entire scientific computing community.


FWIW, I am using ssh + tmux on our computational server. It is robust and extremely convenient.

I exchange data and results back and forth using rsync or a private repo on Gitlab.
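For example (host and paths are placeholders), the round trip looks like:

```shell
rsync -avz input/ me@server:~/project/input/        # push inputs
rsync -avz me@server:~/project/results/ ./results/  # pull results
```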

+1 for tmux, I've also used it with Julia apps started from the REPL. The downside is that tmux has a pretty steep learning curve (I still can't remember the shortcuts, so I keep tmux cheat-sheets around). Will give screen a try; it seems more user friendly.


I use basically 3-4 of them, for creating a new terminal, stepping back and forth between them, and killing an unresponsive one.

No doubt there must be a lot of excellent features I am missing with this approach, but I don’t spend enough time in a terminal to make learning all of them worthwhile.
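For the record, the default tmux bindings covering that minimal workflow (prefix is Ctrl-b):

```shell
# Ctrl-b c      create a new window ("terminal")
# Ctrl-b n / p  step to the next / previous window
# Ctrl-b &      kill the current window (handy when it's unresponsive)
# Ctrl-b d      detach from the session entirely
```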

A quick follow up based on my current struggles at work:

1 - tmux / screen works in some cases, but in production we almost certainly need the web apps daemonized so that they're automatically restarted upon crashing. This means that the Julia process will be started by something like supervisord, not by the user, so we need to attach to that process.

2 - found byobu which is a more user friendly wrapper around tmux and screen
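For point 1, a minimal supervisord entry running Julia directly could look like this sketch (program name and paths are placeholders):

```ini
[program:myapp]
command=julia --project=. bin/server.jl
directory=/srv/myapp
autostart=true
autorestart=true
stdout_logfile=/var/log/myapp.log
```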

How about launching an IJulia kernel and then using @spawn/@async to start your webapp inside it? You can then attach a REPL to it using jupyter console etc.

Interesting! I have 0 knowledge of IJulia’s internals - would that carry a significant performance penalty?

I don’t think you need to know IJulia's internals to use it. There is no direct API for this, but I think you can put something like @spawn start_my_app() in ~/.julia/config/startup_ijulia.jl. Yeah, I know it’s a bit of a dirty trick, but maybe a good start for trying things out. Guarding it with an environment variable (which is set in the supervisord configuration) is probably a good idea. So the invocation you need (e.g., in the supervisord configuration) would be something like START_MY_APP=yes jupyter console --kernel=julia-1.x. To attach to it, use jupyter console --existing KERNEL_ID. See:
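Concretely, the guarded startup file might look like this sketch (START_MY_APP and start_my_app() are the placeholders from above, not real names):

```julia
# ~/.julia/config/startup_ijulia.jl -- runs when the IJulia kernel boots.
# start_my_app() stands in for your app's entry point.
if get(ENV, "START_MY_APP", "no") == "yes"
    @async start_my_app()   # or @spawn, as suggested above
end
```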

Sorry, I’ve never tried this (IJulia + web app) myself so don’t know. I know that a Jupyter kernel needs to open some ZMQ connections (which uses a bunch of TCP ports) to talk with Jupyter frontends. Maybe this can interfere with I/O performance of your web app? For CPU-bound task, my guess is that it won’t matter since people do computations in Jupyter notebooks and they’d complain if there is any performance disadvantage.

Actually, creating a kernel.json with -L myapp.jl julia options may be the simplest and cleanest solution:
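For reference, such a kernel spec might look like the following sketch; the paths are illustrative, so copy the argv from your installed julia-1.x kernel.json and add the -L entry:

```json
{
  "display_name": "Julia (myapp)",
  "argv": [
    "julia", "-i", "--color=yes",
    "-L", "/srv/myapp/myapp.jl",
    "/path/to/IJulia/src/kernel.jl",
    "{connection_file}"
  ],
  "language": "julia"
}
```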

Going down this path, even simpler would be to have supervisord start a tmux session which starts the Julia app. Then one can simply attach to the tmux session.

I just tried it and I can start a new byobu session to start a Genie app with:
byobu new-session -s "session_name" "bin/server"

This can be passed to supervisord

Update 1

Unfortunately, it's not so easy… By default supervisord would supervise tmux itself :frowning: I think it can be set up so that the Julia app creates a pidfile and supervisord watches that. Will investigate, but it takes more digging…

Update 2

After more digging, it's a total no-go. It looks like tmux can't be started through supervisord.


What about something like the following? Adding authentication to this is easy.

## Run this on the server
using Sockets
@async begin
    server = listen(2000)
    while true
        sock = accept(server)
        @async while isopen(sock)
            try
                @info "Receiving content"
                content = String(readavailable(sock))
                @info "Running: $content"
                redirect_stdout(sock) do
                    redirect_stderr(sock) do
                        res = eval(Meta.parse(content))
                        res !== nothing && println(sock, res)
                    end
                end
            catch e
                @warn e
            end
        end
    end
end
## Client
using Sockets
serv = connect(2000)
@async while isopen(serv)
    println(stdout, readline(serv))
end

macro |(content)
    c = string(content)
    return :(println(serv, $c); flush(serv))
end

@| begin
    variable = 10
end

@| variable
# prints 10