[ANN] SimplePlots.jl – Interactive Plots in 4 seconds or your money back

This is in the pipeline! Just thought I’d start with UnicodePlots for the REPL.

// the terminal plots also help a lot with testing

1 Like

This package can’t do 99% of the things Plots.jl can do, so it’s a trade-off. If it gets recipes support and 3D, it’ll be on par, but I think part of the issue with Plots’ compile time is the recipe-handling chain (though that could probably be done better).


I got these times on a MacBook Pro and actually rounded up. You can see them in this notebook:

// if I remember correctly, Windows is slower when there are a lot of included files? maybe that’s related?

edit: the reason for the slower load is actually that you’re doing it through the REPL. This loads UnicodePlots, which is kinda slow. Try it in a Jupyter notebook!

1 Like

I want my money back, it’s 8 seconds on my i5 MacBook Pro :wink:

julia> @time using SimplePlots
  7.831200 seconds (13.15 M allocations: 636.978 MiB, 4.05% gc time)

julia> @time scatter(
         rand(10), label="Scatter", color=4,
         xlabel="x", markersize=6)
  0.398590 seconds (732.47 k allocations: 36.794 MiB)

I certainly like the idea of a fast plotting package, and honestly I am perfectly fine with PGFPlotsX and UnicodePlots.

However, I think that the interactivity of SimplePlots is a nice feature.

What about histograms? A particle physicist is lost without histograms :laughing:


Is this the initial compile time? I’ve been measuring from a precompiled state.

And for a fair comparison, what would the @times be for doing the same action with just Plots.jl (or Interact)? – this might just be an inevitability of using Julia.

Also, will be adding histograms and heatmaps soon!

// lines and scatters just seemed like the place to start for a simple plotting pkg

You can have your money back tho :joy_cat:. The customer is indeed always right. Feel free to post issues (with MWEs) on GitHub too! Big, big fan of happy users.

1 Like

(edit: all times are from the precompiled packages)

julia> @time using SimplePlots
  9.855101 seconds (13.16 M allocations: 637.564 MiB, 3.74% gc time)

However, there is a huge difference between the REPL and Jupyter, and I think this should be emphasised. SimplePlots takes only 1.8 s when I load it in a Jupyter session, compared to up to 10 seconds in the REPL.

Btw. Here is Plots.jl (REPL):

julia> @time using Plots
 12.610763 seconds (17.64 M allocations: 954.746 MiB, 3.46% gc time)

and UnicodePlots (REPL of course):

julia> @time using UnicodePlots
  5.658824 seconds (4.97 M allocations: 242.752 MiB, 1.36% gc time)

…and my favourite, PGFPlotsX, which has the most complete feature set I know of (thanks to tikz and pgf), also in the REPL:

julia> @time using PGFPlotsX
  2.549934 seconds (3.06 M allocations: 151.271 MiB, 2.06% gc time)

I don’t want to be too pessimistic, but I fear that adding more features will end up with the usual problems. However, it seems that you only need some light wrappers around plotly.js, so it might stay very lightweight! Btw, if I worked on a daily basis with Plots.jl (as said, I work full-time with PGFPlotsX and it’s darn fast), I’d definitely compile it into the system image, which means it can be loaded in less than a second, but that has other downsides too.

However, I would like to stress that this comparison is missing a major thing: interactivity. So if you want to get an interactive plot within a few seconds, it’s definitely something! :slight_smile:

1 Like

In Julia 1.5, with the trick published by Jeff

if isdefined(Base, :Experimental) && isdefined(Base.Experimental, Symbol("@optlevel"))
    @eval Base.Experimental.@optlevel 1
end
I now get this in GMT (not long ago I was still in 14 sec)

julia> @time using GMT
  0.404955 seconds (326.46 k allocations: 24.077 MiB)

julia> @time plot(rand(10,2))
  2.744331 seconds (803.98 k allocations: 42.578 MiB, 0.64% gc time)

I just didn’t get whether Experimental is the final name or not.

1 Like

You should be doing this in a Jupyter notebook. I don’t think measuring the REPL speed is fair.

// it’s more a novelty than how people really plot (at least from what I’ve seen at the MIT julia lab)

PGFPlots also requires you to learn a new syntax. This is literally swapping out:

using Plots
using Interact

for

using SimplePlots

And I’ve never heard of Base.Experimental before.

Is this something a new person to Julia should be aware of?

I saw it in some forum post and decided to try it. Apparently Plots is using this too (on master?)

Found the post

1 Like

It wasn’t meant too seriously.

No, I didn’t do any real timing measurements, just counted seconds. To be fair, using Plots takes a lot more time to precompile (> ~40 seconds, and that’s NOT the first time; SimplePlots.jl takes 8 s the first time and ~4 s the second).
So Plots.jl is one of the packages with an exceptionally long wait until you’re ready to go. Precompiling in under 10 seconds is, in my experience, very good and around the normal precompile time.

Again: I like the package and it is quite useful, because for about 15 of 20 plots I just want a quick look, and only a few times am I working on a final plot for sharing or publication.

Measuring the compile times in detail now is, in my opinion, not so useful. A compile time of 4 s or 8 s doesn’t matter for me. A compile time of 40 s is sometimes painful, but in most cases I just wait and think about what I want to do next. Contemplating.

I am sorry if my initial post was taken seriously, which I hadn’t intended, but now I see that it was to be expected. I apologize!

I just updated my comment above, since it really makes a huge difference whether you load it in the REPL or in Jupyter (<2 s vs. ~8 s on a five-year-old MacBook Pro).

I also want to apologise for bringing some negativity into my post above by being a bit pessimistic, which is definitely not what you would like to see in an ANN post and was not my intent. I just appended my personal preference and opinion; I am heavily biased due to my love of PGF/tikz and their benefits :see_no_evil:

I highly appreciate this package which definitely fills a niche in rapid interactive plotting!


[My GUI plots from Julia are down to 281.4 ms including startup, with Gaston.jl, but below is on Julia-only packages.]

I actually got a bit over 6 seconds for just @time using SimplePlots (after precompiling), but that’s more than 3× slower than it needs to be. That is with non-default settings; UnicodePlots alone is almost 8× faster (or gnuplot 19×). Part of this can be achieved with a PR to the package, while --compile=min can’t currently be added to the package itself, and I’d really want selective optimization for that too, similar to Jeff’s trick. Until then, consider:

$ alias julia="~/julia-1.6.0-DEV-8f512f3f6d/bin/julia -O0 --compile=min --startup-file=no"

or similar. The fast-startup PR to the package would work for 1.5 and later, and the fastest possible would work for 1.4 too with the above settings.

With the speed of plot down to 0.77 sec (for UnicodePlots.jl; 0.3 sec for Gaston.jl, which uses a preinstalled gnuplot) with all (startup) overhead, I consider “time to first plot” not an important issue. The setting -O1 is almost as fast (or the same; my machine was a bit loaded, with 29 GB in use), so in general I would probably recommend that for interactive use.

$ ~/julia-1.6.0-DEV-8f512f3f6d/bin/julia -O0 --compile=min --startup-file=no

julia> @time using SimplePlots
  1.858364 seconds (2.79 M allocations: 151.739 MiB, 1.99% gc time)

julia> @time plt = plot([-1, 2, 3, 7], [-1, 2, 9, 4])
  0.019696 seconds (9.91 k allocations: 648.372 KiB)

And the next plot is also fast at this lowest (undocumented) compile setting:

julia> @time plt = plot([-1, 2, 3, 7], [-1, 2, 9, 4], title = "Example Plot", name = "my line", xlabel = "x", ylabel = "y")
  0.088250 seconds (44.32 k allocations: 996.578 KiB)

$ hyperfine 'julia-1.4 -O0 --compile=min --startup-file=no -e "using UnicodePlots; plt = plot([-1, 2, 3, 7], [-1, 2, 9, 4])"'
Benchmark #1: julia-1.4 -O0 --compile=min --startup-file=no -e "using UnicodePlots; plt = plot([-1, 2, 3, 7], [-1, 2, 9, 4])"
  Time (mean ± σ):     839.0 ms ±  36.5 ms    [User: 1.238 s, System: 0.496 s]
  Range (min … max):   773.4 ms … 885.8 ms    10 runs

$ hyperfine 'julia-1.6.0-DEV -O0 --compile=min --startup-file=no -e "using SimplePlots; plt = plot([-1, 2, 3, 7], [-1, 2, 9, 4])"'
Benchmark #1: julia-1.6.0-DEV -O0 --compile=min --startup-file=no -e "using SimplePlots; plt = plot([-1, 2, 3, 7], [-1, 2, 9, 4])"
  Time (mean ± σ):      2.145 s ±  0.047 s    [User: 2.467 s, System: 0.547 s]
  Range (min … max):    2.063 s …  2.228 s    10 runs

But we don’t want to start Julia with -O0 just to make TTFP look low, do we? What about the rest of Julia usage?

1 Like

That’s only half true. It wouldn’t surprise me at all if Julia moves more in the direction of other JIT languages, and run on very low optimization levels for non-performance sensitive code. Getting this right would be hard, but it very well might be easier than getting fast compile times. My guess is in a year or two, Julia will probably be much better at both interpreting code, and having statically compiled code. Packages will largely be precompiled, and non-critical code will be interpreted. This is just wild speculation by a mostly outside observer though, so take it with a heavy grain of salt.


I don’t think that’s far from how it will end up. SnoopCompileBot and invalidation fixes will allow a lot more to precompile, and module-specific optimization levels are being released in v1.5. Jeff found that with module-specific optimizations, -O1 was better than -O0 anyway, so I think what we’ll see is more basic packages opting into -O1 and snoop-compiling enough. v1.6 has a few fixes coming for things like kwarg splatting that I know will increase the amount of good type inference that large packages like DiffEq get, which will further help compile times and allow pairing with compile snooping a bit better. In the end, I think we’re not too far away from a very comfortable position in terms of compile times, and if you try v1.0 Plots.jl without the precompile file you’ll see just how far we’ve come with recent advances.


Sorry, just for clarity, this package is meant to be used inside Jupyter notebooks.

It is actually much slower running through the terminal because UnicodePlots is only used in that case.

Inside a notebook is where you get the 4 s to first plot.

// which would probably be faster with the optimization modes discussed

edit: it also has a more holistic approach to interactive plots using javascript. for example, you can refresh the page and widgets still work. you can hide a trace on a lineplot and it carries over once you drag a slider. yada yada

That’s why the three-line addition to your package (what I was referring to with “can be achieved partially with a PR to the package”) is important (it works for 1.5 or later and does nothing on earlier versions):
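The addition itself wasn’t quoted here, but based on Jeff’s trick shown earlier in this thread, a hedged sketch of what such a three-line patch to a package might look like (the module name below is just a placeholder, not the actual PR):

```julia
module SomePlottingPackage  # placeholder name; substitute the real package module

# Lower the optimization level for code in this module on Julia 1.5+.
# The isdefined guard makes this a no-op on earlier Julia versions,
# so the package stays loadable everywhere.
if isdefined(Base, :Experimental) && isdefined(Base.Experimental, Symbol("@optlevel"))
    @eval Base.Experimental.@optlevel 1
end

# ... rest of the package code, now compiled at the reduced level ...

end
```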

But you might be surprised how effective -O0 or -O1 can be; I’ve found a benchmark that benefited from it (so don’t dismiss it until you’ve tried it).


There’s not always a trade-off. Sometimes it’s just a win-win, which is why my PR for JLL packages was finally merged: they will all use -O0, but I’m told they all first need to get a new version out to trigger code generation.

See: https://github.com/JuliaPackaging/Yggdrasil

Anyway if you want fast plotting with default settings:

(@v1.6) pkg> add GR#master   # there's a performance regression in `using` for the latest version; I filed an issue and it's fixed on master

julia> @time using GR
  0.221278 seconds (68.59 k allocations: 5.280 MiB)  # the author claims 0.17 sec, maybe it's with non-default settings

and FYI:
$ julia --startup-file=no -O1

julia> @time using GR
  0.150216 seconds (68.59 k allocations: 5.280 MiB)

The non-first plot shouldn’t be slower, even with -O1, as the heavy lifting is done by a natively precompiled library; the same applies to all JLL packages. My PR (at BinaryBuilder.jl) to default to -O0 for JLL packages was finally merged, to speed up loading (using) those packages. But the rest of your code will be affected by non-default settings (global, or local to modules as of 1.5). With more JLL packages (we’re up to 441 now), we’ll have more of a Python feeling, the best of both worlds:

E.g. one that already got the treatment:

julia> @time using LibPQ_jll
  0.404593 seconds (366.31 k allocations: 21.175 MiB, 2.87% gc time)

The main package is still slow (as it has other slow dependencies that are not yet JLL):

julia> @time using LibPQ
  9.215787 seconds (16.41 M allocations: 834.362 MiB, 4.51% gc time)

with -O0:

julia> @time using LibPQ
  6.377934 seconds (16.41 M allocations: 834.320 MiB, 6.54% gc time)

For some reason -O0 is also used for Python 3 code, which actually beats the Julia code (we need to look into that).



cross-posted from this issue:

Does the @demo macro work with CompositePlots? I’d like to have some sliders that affect two plots at once, but I can’t figure out how to get both of them to show up… see the issue linked above for what I’ve tried.

feel like paying $8…

1 Like