Makie is pretty great for plotting big datasets, in particular the datashader functionality (docs: datashader · Makie).
But in the case of WGLMakie, you would first have to send the dataset to the client, which could be gigabytes.
Over in the Python world, the Datashader + Holoviews + Plotly combination takes a different approach: Datashader produces an image which is rendered in Plotly, and Holoviews hooks up Plotly zoom events to rerender the image. Example: https://twitter.com/pepijndevos/status/1509948476586663937?s=19 Docs: Interactivity — Datashader v0.14.4
This way you’re only sending kilobytes of data to the client rather than gigabytes, but you can still zoom in on the data and see more detail.
I think it would definitely be possible and useful to have access to this approach for big data notebooks, but I haven’t quite figured out all the plumbing, so I thought I’d share what I’ve got in case someone has better ideas.
Basically I use Makie to generate a static plot and use `show` to turn it into a base64-encoded data URL. I then set this as an image on the Plotly plot, and hook up the `plotly_relayout` event to send the plot bounds back to Julia, which I use to set the limits on the Makie plot.
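Roughly, the Makie side looks like this (a minimal sketch, not the exact notebook code; CairoMakie for offscreen rendering and the helper names `render_view` / `figure_to_dataurl` are just for illustration):

```julia
using CairoMakie, Base64

# Render a view of the data for the given limits. A plain scatter stands in
# here for the actual Makie datashader plot of the full dataset.
function render_view(data, xlims, ylims)
    fig = Figure(figure_padding = 0)
    ax = Axis(fig[1, 1]; limits = (xlims..., ylims...))
    # Hide Makie's own decorations so the image lines up with Plotly's axes.
    hidedecorations!(ax)
    hidespines!(ax)
    scatter!(ax, data)
    fig
end

# Turn the figure into a base64-encoded PNG data URL that Plotly can display.
function figure_to_dataurl(fig)
    io = IOBuffer()
    show(io, MIME"image/png"(), fig)
    "data:image/png;base64," * base64encode(take!(io))
end

data = [Point2f(randn(), randn()) for _ in 1:100_000]
dataurl = figure_to_dataurl(render_view(data, (-3, 3), (-3, 3)))
```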
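The Plotly side then sits in a cell bound with `@bind`, so that zooming reports the new bounds back to Julia. Something along these lines (again a sketch, assuming Pluto plus HypertextLiteral; `plotly_view` is a hypothetical helper and the plotly.js loading method and version are just one option):

```julia
using HypertextLiteral: @htl

function plotly_view(dataurl, xlims, ylims)
    @htl """
    <div>
    <script>
        // The wrapper div is the element whose value @bind observes.
        const wrapper = currentScript.parentElement;
        const plotdiv = document.createElement("div");
        wrapper.appendChild(plotdiv);

        const { default: Plotly } = await import("https://esm.sh/plotly.js-dist-min@2.27.0");

        // Empty Plotly plot whose background image is the rendered Makie figure.
        await Plotly.newPlot(plotdiv, [{x: [], y: []}], {
            images: [{
                source: $(dataurl),
                xref: "x", yref: "y",
                x: $(xlims[1]), y: $(ylims[2]),
                sizex: $(xlims[2] - xlims[1]), sizey: $(ylims[2] - ylims[1]),
                sizing: "stretch", layer: "below",
            }],
            xaxis: {range: [$(xlims[1]), $(xlims[2])]},
            yaxis: {range: [$(ylims[1]), $(ylims[2])]},
        });

        // On zoom/pan, report the new bounds back to Julia.
        plotdiv.on("plotly_relayout", (ev) => {
            if (!("xaxis.range[0]" in ev)) return;  // ignore autorange etc.
            wrapper.value = [ev["xaxis.range[0]"], ev["xaxis.range[1]"],
                             ev["yaxis.range[0]"], ev["yaxis.range[1]"]];
            wrapper.dispatchEvent(new CustomEvent("input"));
        });
    </script>
    </div>
    """
end

# In another cell:
#   @bind bounds plotly_view(dataurl, (-3, 3), (-3, 3))
# After a zoom, `bounds` holds [xmin, xmax, ymin, ymax], which is what gets
# fed back into the Makie plot's limits.
```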
But I haven’t actually managed to close the loop where Plotly updates with the new Makie plot.
Here is the notebook I’m playing around with: JuliaHub