Have you considered using a jupyter notebook?
That should allow using a remote julia session (which stays alive)
Or is this not an option for you?
You can check this out on JuliaBox.
Apologies if I misunderstood your problem.
screen and tmux are the canonical solutions to this problem.
Juno can automate that process for you pretty well, if you want to give it a go.
Thank you for the suggestion but I really need to start my simulations from the command line. Carlo
Actually I was planning to use Revise if I manage to get this working. Thank you
@johnh and @pfitzseb
I knew about screen but it didn’t come to mind as a solution to this problem. I think with screen I could get what I want, but if I could attach and detach a REPL using the same tools used for distributed computing, I could also get the results of my simulations, which would help me understand whether everything went OK. I could probably do that with screen too, but it would be more complicated.
Attaching a REPL to a running remote process is in principle possible but no one has done the work to make it happen. Feel free to open a feature request issue on GitHub. Note that there are pretty hefty security implications of allowing such a thing, so it’s not just a matter of making it work, but also of making it work safely.
Thank you for the feedback. Would it also be unsafe to use the Distributed.jl package infrastructure? I don’t see a big difference between attaching to a running Julia process and distributed computing with Julia.
True, being able to start a REPL on any worker would be safe and probably easier to implement since the “RPC” already just works.
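A minimal sketch of that idea, using nothing but Distributed's existing RPC. Here a local worker stands in for a remote process (which you would instead add with something like addprocs(["user@host"])), and inspect is a hypothetical helper name:

```julia
# Sketch: "attaching" to a running worker by evaluating expressions in it.
using Distributed

addprocs(1)            # a local worker; a remote one works the same way
w = last(workers())

# Pretend the worker is a long-running app holding some state.
remotecall_fetch(Core.eval, w, Main, :(state = Dict(:runs => 3)))

# Later, inspect that state by shipping expressions to the worker's Main.
inspect(ex) = remotecall_fetch(Core.eval, w, Main, ex)

inspect(:(state[:runs]))   # returns 3
```

A full REPL on the worker would layer line editing and display on top of this, but the evaluation channel itself already exists.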
For long-running server-side applications (everything web dev) this would be a very valuable feature. For reference, it is available in Erlang/Elixir. The value resides in the fact that
a. one can attach to a running session to debug issues and inspect application state
b. we could theoretically update the running code by manually triggering Revise to inject application updates without restarting the app (no downtime)
If you start remote IJulia via remote_ikernel you can attach as many local clients as you want. The catch is that you have to start IJulia via remote_ikernel, and losing the connection will kill the remote process. I’ve been wanting to write a “Jupyter kernel proxy” that can bridge a pre-existing remote Jupyter kernel and local clients through SSH. This would help not only the Julia community but the entire scientific computing community.
FWIW, I am using tmux on our computational server. It is robust and extremely convenient. I exchange data and results back and forth using rsync or a private repo on GitLab.
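For concreteness, a sketch of that exchange (the host and paths are placeholders):

```shell
# Push input data up to the server
rsync -avz data/ user@server:project/data/

# Pull results back down after a run
rsync -avz user@server:project/results/ results/
```

The trailing slashes matter: `src/` copies the directory's contents, while `src` would copy the directory itself.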
+1 for tmux, I’ve also used it with Julia apps started from the REPL. The downside is that tmux has a pretty steep learning curve (I still can’t remember the shortcuts, so I keep tmux cheat sheets around). Will give screen a try; it seems more user friendly.
I use basically 3-4 of them, for creating a new terminal, stepping back and forth between them, and killing an unresponsive one.
No doubt there must be a lot of excellent features I am missing with this approach, but I don’t spend enough time in a terminal to make learning all of them worthwhile.
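Concretely, that core workflow looks like this (the session name julia-sim and script name sim.jl are placeholders):

```shell
# Start a detached session running a long Julia job, then log out freely
tmux new-session -d -s julia-sim 'julia --project sim.jl'

# Reattach later from a fresh SSH login
tmux attach -t julia-sim

# The handful of default bindings that cover day-to-day use:
#   Ctrl-b c              create a new window
#   Ctrl-b n / Ctrl-b p   step to the next / previous window
#   Ctrl-b &              kill an unresponsive window
#   Ctrl-b d              detach, leaving everything running
```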
A quick follow up based on my current struggles at work:
1 - screen works in some cases, but in production we most certainly need the web apps daemonized so that they’re automatically restarted upon crashing. This means that the Julia process will be started by something like supervisord, not by the user, so we need to attach to that process.
2 - I found byobu, which is a more user-friendly wrapper around tmux.
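For reference, a hypothetical supervisord program entry for such a setup (the program name and all paths are made up):

```ini
; /etc/supervisor/conf.d/myapp.conf -- hypothetical example
[program:myapp]
command=julia --project=/srv/myapp /srv/myapp/bin/server
directory=/srv/myapp
autostart=true
autorestart=true
stdout_logfile=/var/log/myapp.out.log
stderr_logfile=/var/log/myapp.err.log
```

supervisord expects `command` to stay in the foreground, which is exactly the constraint that makes attaching to the process hard.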
How about launching an IJulia kernel and then using @async to start your webapp inside of it? You can then attach a REPL to it using jupyter console etc.
Interesting! I have 0 knowledge of IJulia’s internals - would that carry a significant performance penalty?
I don’t think you need to know IJulia’s internals to use it. There is no direct API for this, but I think you can put something like @spawn start_my_app() in ~/.julia/config/startup_ijulia.jl. Yeah, I know it’s a bit of a dirty trick, but maybe a good start for trying things out. Guarding it with an environment variable (which is set in the supervisord configuration) is probably a good idea. So, the invocation you need to use (e.g., put in the supervisord configuration) would be something like START_MY_APP=yes jupyter console --kernel=julia-1.x. To attach to it, use jupyter console --existing KERNEL_ID. See: https://jupyter-console.readthedocs.io/en/latest/
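Putting those pieces together, the guarded startup file might look like this sketch (start_my_app is a placeholder for your application's entry point):

```julia
# Sketch for ~/.julia/config/startup_ijulia.jl: only launch the app when
# the environment variable from the supervisord configuration is set.
if get(ENV, "START_MY_APP", "") == "yes"
    @async start_my_app()   # run the app without blocking kernel startup
end
```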
Sorry, I’ve never tried this (IJulia + web app) myself, so I don’t know. I know that a Jupyter kernel needs to open some ZMQ connections (which use a bunch of TCP ports) to talk with Jupyter frontends. Maybe this can interfere with the I/O performance of your web app? For CPU-bound tasks, my guess is that it won’t matter, since people do computations in Jupyter notebooks and they’d complain if there were any performance disadvantage.
Actually, creating a custom kernel spec that passes the -L myapp.jl option to julia may be the simplest and cleanest solution: https://jupyter-client.readthedocs.io/en/stable/kernels.html#kernel-specs
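Following that link, the idea would be to copy the installed IJulia kernel.json and add the extra flag; roughly (the kernel.jl path is a placeholder for wherever IJulia is installed on your machine):

```json
{
  "display_name": "Julia (myapp)",
  "argv": ["julia", "-i", "--color=yes",
           "-L", "myapp.jl",
           "/path/to/IJulia/src/kernel.jl",
           "{connection_file}"]
}
```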
Going down this path, even simpler would be to have supervisord start a tmux session which starts the Julia app. Then one can simply attach to the tmux session.
I just tried it and I can start a new byobu session running a Genie app with:
byobu new-session -s "session_name" "bin/server"
This can be passed to supervisord.
Unfortunately, it’s not so easy… By default supervisord would supervise the tmux process itself, not the Julia app. I think it can be set up so that supervisor watches the Julia process the tmux session creates. Will investigate, but it takes more digging…
After more digging, it’s a total no-go. Looks like tmux can’t be started through supervisord.
What about something like the following? Adding authentication to this would be easy.
```julia
## Run this on the server
using Sockets

@async begin
    server = listen(2000)
    while true
        sock = accept(server)
        @async while isopen(sock)
            @info "Receiving content"
            content = String(readavailable(sock))
            @info "Running: $content"
            try
                redirect_stdout(sock) do
                    redirect_stderr(sock) do
                        res = eval(Meta.parse(content))
                        res !== nothing && println(sock, res)
                    end
                end
            catch e
                @warn e
            end
        end
    end
end

## Client
using Sockets

serv = connect(2000)
@async while isopen(serv)
    println(stdout, readline(serv))
end

macro |(content)
    c = string(content)
    return :(println(serv, $c); flush(serv))
end

@| begin
    variable = 10
end

@| variable # prints 10
```