blessed is “a curses-like library with a high level terminal interface API for node.js”. blessed.contrib is an extension of the library that lets you “build dashboards (or any other application) using ascii/ansi art and javascript”.
blessed.contrib looks amazing with its terminal-based dashboards. I would like to have this functionality in Julia, except I don’t even know where to begin or how to wrap a library.
I am hoping someone in the community who has some spare time and knows a bit of JavaScript can pick this up. I will learn as we go along and at some point help contribute. For those interested, there is also a Python wrapper, so it seems to me that it could be done.
The Python project and the JavaScript project do not appear to be related.
As far as I know (and there’s a lot I don’t know), there isn’t a library in Julia yet for calling Node. I believe one of the reasons the Python bindings work so well is that Python has a C extension API that Julia can use directly. As far as I’m aware, there’s no analogous API for Node.
Before we can start wrapping JavaScript packages, someone has to do the non-trivial work of building some kind of interface between the two runtimes, and my guess is that it would need to be based on some kind of RPC, and therefore be a lot less efficient than what is possible with Python. (edit: but there might be a way to wrap Blink…)
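For what it’s worth, here is a rough sketch of what that RPC approach could look like, just to show the shape of it: spawn a node child process from Julia and exchange newline-delimited JSON over its stdin/stdout. Everything here (the embedded script, the request format, the rpc helper) is made up for illustration; the only real dependencies are node on the PATH and the JSON.jl package.

# Sketch only: a newline-delimited JSON "RPC" between Julia and a node child process.
using JSON

# Tiny node "server": reads one JSON request per line, writes one JSON reply per line.
node_script = """
const rl = require('readline').createInterface({ input: process.stdin });
rl.on('line', line => {
  const req = JSON.parse(line);
  // Pretend this dispatched into blessed; here it just adds two numbers.
  process.stdout.write(JSON.stringify({ id: req.id, result: req.a + req.b }) + "\\n");
});
"""

proc = open(`node -e $node_script`, "r+")    # child process with piped stdin/stdout

function rpc(proc, request::Dict)
    println(proc, JSON.json(request))        # send one request line
    return JSON.parse(readline(proc))        # block until the reply line arrives
end

@show rpc(proc, Dict("id" => 1, "a" => 2, "b" => 40))   # => Dict("id"=>1, "result"=>42)
close(proc)

Every call crosses a process boundary and a JSON round trip, which is exactly why this would be far slower than Python’s in-process C extension API.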
And, speaking candidly, I believe the wider berth we give the Node ecosystem, the happier we shall be in the long run. It’s not a horse I’m keen to hitch my wagon to.
I’m going to go out on a limb on this one and say that I don’t think creating a Node binding library is worth your time and effort just to get blessed working in Julia. You’d be better off doing a straight port of blessed to Julia. Why do I say this?
Node is a large runtime, taking about the same amount of memory as Julia while just doing nothing. A TUI should not take hundreds of MB of RAM just to do basic things (the same goes for Julia, but at least it’s not a lost cause).
The npm ecosystem is, frankly, a giant burning pile of trash, and it would be best to avoid having Julia packages rely on it unless absolutely necessary (like in the case of Atom).
Speaking mostly for myself personally: when looking for a TUI library to use in one of my projects, if I find that yours requires Node just to load, I’ll be skipping over it entirely.
TUIs aren’t that hard once you understand terminal escape codes. You can also use VT100 for any string-to-character mappings you might need to do, saving you effort in the parsing department.
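To make the escape-code part concrete, here is a tiny standalone sketch (not from any package, just raw ANSI/VT100 sequences) of clearing the screen, moving the cursor, and printing styled text from Julia:

# ANSI/VT100 escape sequences are just strings written to the terminal.
const ESC = "\e["    # CSI ("control sequence introducer")

clear_screen()        = print(ESC, "2J")                # erase the whole screen
move_cursor(row, col) = print(ESC, row, ";", col, "H")  # 1-based cursor position
set_fg(color)         = print(ESC, 30 + color, "m")     # color ∈ 0:7 (ANSI palette)
reset_style()         = print(ESC, "0m")

clear_screen()
move_cursor(2, 5)
set_fg(2)                            # green
print("Hello from a bare-bones TUI")
reset_style()
move_cursor(4, 1)                    # park the cursor below the text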
If you want to add a feature to your package, such as a new widget, you’d need to first write it in JS and then get it accepted into upstream blessed. Blessed hasn’t seen a commit in 3 years, so good luck with that…
If you write it in native Julia, I would be happy to contribute to it. I spend my entire day staring at terminals, and am quite fond of TUI interfaces, and also have experience writing them in Julia.
Thanks, I guess it’s currently infeasible to create a wrapper library. Actually, thinking about it some more, I don’t see why this couldn’t be done in pure Julia. I don’t have the expertise to begin such a project, but hopefully one day I can start one (or contribute to one).
No worries, it’s really quite easy once you get started (and I can say the same about Julia itself, of course). I actually already have a TUI-focused package called Bento.jl (at Samantha.ai / Bento.jl on GitLab). I’d be happy to do the work to pull the plain TUI stuff out of it into a separate, reusable package. I’ll leave it to you to decide on a name if you’d like.
Ok, I spent the minimum possible amount of effort to just pull some useful code out of my Bento.jl package: https://github.com/jpsamaroo/Wicked.jl. I picked a name that I almost guarantee has been used by another blessed-type library, so please do make recommendations for another name if you have some good ideas.
For a quick overview:
Wicked relies only on VT100, but uses it for a variety of things, like terminal control code parsing, terminal emulation (text only), and some useful types like Cell
Panel is just a wrapper around a Matrix{Cell}, which is a useful abstraction for passing around “surfaces” between different functions
PanelIO is a wrapper around VT100.ScreenEmulator, which is a simple terminal text emulator and parser
panel_dump! and cell_dump! do the bulk of the work of rendering a Panel to an IO object like stdout/stderr, and are reasonably correct based on my experience writing Bento.jl (see the toy sketch below for the general idea)
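To give a feel for the panel-as-matrix idea, here is a toy illustration with made-up names; this is NOT Wicked’s actual API, so check the repo for the real types and functions:

# Toy sketch: a "panel" is just a matrix of cells that gets rendered to an IO.
struct ToyCell
    char::Char
    fg::Int          # ANSI foreground color, 0:7
end

const ToyPanel = Matrix{ToyCell}

blank_panel(rows, cols) = fill(ToyCell(' ', 7), rows, cols)

# Render a panel by walking it row by row and emitting escape codes.
function dump_panel(io::IO, p::ToyPanel)
    for r in 1:size(p, 1)
        print(io, "\e[", r, ";1H")          # move cursor to start of row r
        for c in 1:size(p, 2)
            cell = p[r, c]
            print(io, "\e[", 30 + cell.fg, "m", cell.char)
        end
    end
    print(io, "\e[0m")                      # reset styling when done
end

panel = blank_panel(5, 20)
for (i, ch) in enumerate("hello")
    panel[2, 2 + i] = ToyCell(ch, 2)        # write "hello" in green on row 2
end
dump_panel(stdout, panel)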
Note that there is currently no README, no “widgets” or anything like that, and probably plenty of bugs in what functionality does exist. I fully hope that anyone interested will contribute and help make this package into something useful. If not, it’s probably going to end up unmaintained, because I have a million other things on my plate right now.
That said, if other people help out with improving it, I’ll play my role and do what I can as well. I’m also happy to let someone take the package off my hands if they feel up to the challenge, although I’m also happy to keep the maintainer role in the meantime.
P.S. There’s still some more functionality in Bento.jl that I can try to port over, but a lot of it is specific to Bento’s structure and types, so it would be best if this package can be fleshed out and a consistent API defined before I try to do the porting.
I’m not sure I disagree with your first point (though some packages are good). But at least you can call Node (not yet its replacement), [and you could call from it back into Julia, i.e. even into old Julia versions.]
I see “Blessed is over 16,000 lines of code and terminal goodness”, so I’m not sure a port is advisable; a wrapper for it (or for some other library) might be better.
VegaLite.jl is fast (after the first plot), is high quality, and uses Node; even its first plot can be faster than for some alternatives, such as Plots.jl currently (maybe not a fair comparison, but only PyPlot.jl is a bit faster):
$ ~/julia-1.6-DEV-latest-4e2fb5c72c/bin/julia --startup-file=no -O0 --compile=min
julia> @time using VegaLite
4.583125 seconds (12.13 M allocations: 611.186 MiB, 8.49% gc time)