Is the freezing / deadlock issue supposed to be fixed? I am running Atom.jl 0.12.7 on macOS with Julia 1.3, and Atom seems to freeze at random timepoints when executing code blocks. Upon interruption, I get the following stacktrace:
ERROR: InterruptException:
Stacktrace:
[1] try_yieldto(::typeof(Base.ensure_rescheduled), ::Base.RefValue{Task}) at ./task.jl:611
[2] wait() at ./task.jl:668
[3] wait(::Base.GenericCondition{Base.AlwaysLockedST}) at ./condition.jl:106
[4] blockinput() at /Users/sdobber/.julia/packages/Atom/QOKyk/src/repl.jl:97
[5] evalrepl(::Module, ::String) at /Users/sdobber/.julia/packages/Atom/QOKyk/src/repl.jl:204
[6] top-level scope at /Users/sdobber/.julia/packages/Atom/QOKyk/src/repl.jl:260
This still does not get me back to the REPL, and I am forced to stop Julia.
Is the freezing / deadlock issue supposed to be fixed?
Supposed to be? Yes. Did we catch all cases where a deadlock might happen? No.
Would be very useful if you could narrow down the “random timepoints” when the freeze happens.
I’ll try my best to narrow it down, but “random” was the best I could come up with… Sometimes Atom works fine for hours, sometimes it freezes after 30 seconds in a new Julia session. Every time it happens, it’s during what looks like a trivial operation to me, like using DifferentialEquations or assigning a variable.
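For reference, a block as trivial as the following can trigger it for me (DifferentialEquations just happens to be the package I load; a plain assignment on its own has frozen too, so I don’t think the specific code matters):

using DifferentialEquations
x = 1.0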
On my computer, things seemed to get better at one point yesterday, and the only real difference was that all packages other than Atom.jl had been precompiled again. Do you think this could have an effect on the issue? (I’m not deep enough into Julia myself to understand if and how different packages influence each other during precompilation.)
I updated ink and julia-client to v0.12.2, then Atom.jl to v0.12.7, and I get a very strange bug: I can’t execute using Plots (or using any other package) as the first instruction in the console without getting a segmentation fault. But everything is fine if I evaluate another expression, e.g. 2+2, before evaluating the using!
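To be concrete, the sequence that works for me looks roughly like this (Plots is just the package I happened to try; any other package shows the same behaviour):

julia> 2 + 2   # evaluating any trivial expression first
4

julia> using Plots   # no segfault when something else was evaluated before

Running the using Plots line as the very first evaluation instead crashes the process.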
My operating system is Ubuntu 18.04, and my version of Atom is 1.44.0 x64.
I have a similar issue: after working in one Julia session for around an hour, stopping the session, and attempting to start a new one, Julia gets stuck in a deadlock.
I do not get this issue when starting a new Julia session and killing it immediately after a fresh Atom start. I have only gotten it after working with Julia for a little while, killing it, and then attempting to restart.
The only way for me to get around this issue at the moment is by restarting Atom.
I am using:
julia-client v0.12.2
Atom.jl v0.12.7
Cell highlighting is now more subtle; we also added instructions on how to customize it (#700).
Fixed a bug where top-level macro calls could sometimes erroneously end up in the outline view (#295).
You’ll need to be on the following package versions:
Atom.jl@0.12.8
Juno.jl@0.8.1
julia-client@0.12.3
ink@0.12.3
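If it helps, the Julia-side updates can be done roughly like this (standard Pkg API; a sketch only, your environment may resolve differently). julia-client and ink are Atom editor packages, so they are updated through Atom’s package manager / Updates pane rather than through Pkg:

using Pkg
# pin the Julia-side packages to the versions listed above
Pkg.add([PackageSpec(name="Atom", version="0.12.8"),
         PackageSpec(name="Juno", version="0.8.1")])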
Also note that there still is a chance that Julia will segfault when using Juno together with Julia 1.0.x – we weren’t able to figure out why this is happening. I’d recommend either upgrading your Julia version or reverting back to Juno 0.11 until the bug is fixed.
It seems to be better with regard to exiting the REPL and restarting it (on Windows). It will eventually hang, but now it takes half a dozen or so cycles, rather than one or two.
BTW, is the Juno.jl version (0.8.1) correct? ] update brought me to Atom v0.12.8, but Juno is at v0.7.2.
Hmm, but we can’t improve or fix anything given just “the occasional hang”. It would be helpful if you could describe your environment and setup, when you encountered the bug, an MRE, etc.
Instructions to reproduce:
Start Atom
Press Enter in the REPL window, wait for the prompt, issue exit(), and repeat this process of stopping and restarting the session. Eventually it hangs, looking like this:
julia> exit()
Julia has exited.
Press Enter to start a new session.
It can be restarted by using the reload command.
Doing this just now, it survived 7 cycles, then 1, then 2, then 5, before needing to be reloaded. I usually give it maybe 5 seconds to see whether it’s just hesitating briefly, but the few times I’ve taken a break and let it sit (like right now, while typing this post) it has never woken back up.
I gave Juno 0.12 another try, and I can’t work at all anymore. I will provide any configuration details you need; for now, here is just the error message:
signal (11): Segmentation fault
in expression starting at no file:0
jl_compile_linfo at /buildworker/worker/package_linux64/build/src/codegen.cpp:1191
emit_invoke at /buildworker/worker/package_linux64/build/src/codegen.cpp:3094
emit_expr at /buildworker/worker/package_linux64/build/src/codegen.cpp:3893
emit_ssaval_assign at /buildworker/worker/package_linux64/build/src/codegen.cpp:3615
emit_stmtpos at /buildworker/worker/package_linux64/build/src/codegen.cpp:3801 [inlined]
emit_function at /buildworker/worker/package_linux64/build/src/codegen.cpp:6262
jl_compile_linfo at /buildworker/worker/package_linux64/build/src/codegen.cpp:1159
emit_invoke at /buildworker/worker/package_linux64/build/src/codegen.cpp:3094
emit_expr at /buildworker/worker/package_linux64/build/src/codegen.cpp:3893
emit_ssaval_assign at /buildworker/worker/package_linux64/build/src/codegen.cpp:3615
emit_stmtpos at /buildworker/worker/package_linux64/build/src/codegen.cpp:3801 [inlined]
emit_function at /buildworker/worker/package_linux64/build/src/codegen.cpp:6262
jl_compile_linfo at /buildworker/worker/package_linux64/build/src/codegen.cpp:1159
emit_invoke at /buildworker/worker/package_linux64/build/src/codegen.cpp:3094
emit_expr at /buildworker/worker/package_linux64/build/src/codegen.cpp:3893
emit_ssaval_assign at /buildworker/worker/package_linux64/build/src/codegen.cpp:3615
emit_stmtpos at /buildworker/worker/package_linux64/build/src/codegen.cpp:3801 [inlined]
emit_function at /buildworker/worker/package_linux64/build/src/codegen.cpp:6262
jl_compile_linfo at /buildworker/worker/package_linux64/build/src/codegen.cpp:1159
emit_invoke at /buildworker/worker/package_linux64/build/src/codegen.cpp:3094
emit_expr at /buildworker/worker/package_linux64/build/src/codegen.cpp:3893
emit_ssaval_assign at /buildworker/worker/package_linux64/build/src/codegen.cpp:3615
emit_stmtpos at /buildworker/worker/package_linux64/build/src/codegen.cpp:3801 [inlined]
emit_function at /buildworker/worker/package_linux64/build/src/codegen.cpp:6262
jl_compile_linfo at /buildworker/worker/package_linux64/build/src/codegen.cpp:1159
jl_fptr_trampoline at /buildworker/worker/package_linux64/build/src/gf.c:1774
jl_apply_generic at /buildworker/worker/package_linux64/build/src/gf.c:2162
withpath at /home/j2b2/.julia/packages/Atom/wlPiw/src/eval.jl:9
jl_fptr_trampoline at /buildworker/worker/package_linux64/build/src/gf.c:1809
jl_apply_generic at /buildworker/worker/package_linux64/build/src/gf.c:2162
#262 at /home/j2b2/.julia/packages/Atom/wlPiw/src/completions.jl:18
jl_fptr_trampoline at /buildworker/worker/package_linux64/build/src/gf.c:1809
jl_apply_generic at /buildworker/worker/package_linux64/build/src/gf.c:2162
handlemsg at /home/j2b2/.julia/packages/Atom/wlPiw/src/comm.jl:168
unknown function (ip: 0x7fa1b5b55dec)
jl_apply_generic at /buildworker/worker/package_linux64/build/src/gf.c:2162
jl_apply at /buildworker/worker/package_linux64/build/src/julia.h:1537 [inlined]
jl_f__apply at /buildworker/worker/package_linux64/build/src/builtins.c:556
#19 at ./task.jl:259
unknown function (ip: 0x7fa1b5b41094)
jl_apply_generic at /buildworker/worker/package_linux64/build/src/gf.c:2162
jl_apply at /buildworker/worker/package_linux64/build/src/julia.h:1537 [inlined]
start_task at /buildworker/worker/package_linux64/build/src/task.c:268
unknown function (ip: 0xffffffffffffffff)
Allocations: 102007337 (Pool: 101985682; Big: 21655); GC: 221