Today, instead of doing my work, I learned about the new artifact system, and I am absolutely fascinated.
I am currently working on a project that uses some data files, which I store in my OneDrive for Business folder. My module containing all my functions also lives in my OneDrive, so it can reach the data via a relative path, which is clunky but works.
However, the computations are quite expensive, so I intend to move to a server soon, which raises the question of how to set up everything that I need.
My thought just now was to use the new artifact system, add all my data files as artifacts, and lazily load them as I need them. I have already seen that I can put the BinaryBuilder recipes into my own private repo, where my modules already live, so I would only need to set up SSH/deploy keys on the server to be able to clone easily from GitHub.
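To make it concrete, here is a minimal sketch of how I imagine using such a lazy artifact from my module; the artifact name "measurements" and the file name are made up:

```julia
# Minimal sketch, assuming an artifact named "measurements" is bound
# in my package's Artifacts.toml with lazy = true.
using Pkg.Artifacts

function load_measurements()
    # The artifact"" string macro looks up the entry in Artifacts.toml;
    # a lazy artifact is downloaded on first access and then cached.
    data_dir = artifact"measurements"
    return joinpath(data_dir, "measurements.csv")  # hypothetical file name
end
```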
My questions, however, are these:
- Where to store my data and how to access it securely:
The files are already on OneDrive; can I just put the OneDrive download link into the Artifacts.toml, roughly as in the sketch after this list? Can I maybe even include some authentication, so that not just anyone with the link can download from my drive? The data is not really cleared for general release.
- Is this even a good idea?
Given that time is always short, and that fast, hacky results are sometimes more appreciated than slow, elegant ones, I wonder whether this makes sense to do. I would hope that once I have learned how the artifact system works, the effort will have been worth it, because setting up all future projects will be very fast.
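For the first question, this is roughly the Artifacts.toml entry I have in mind. Everything in it is a placeholder, and as far as I understand, the URL would have to serve a gzipped tarball of the data rather than the individual files:

```toml
# Hypothetical entry; name, hashes, and URL are placeholders.
[measurements]
git-tree-sha1 = "..."   # tree hash of the unpacked data directory
lazy = true             # only fetch when first accessed

    [[measurements.download]]
    url = "https://mycompany-my.sharepoint.com/.../measurements.tar.gz?download=1"
    sha256 = "..."      # hash of the tarball itself
```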
I apologize for asking without trying it myself, but time is really short right now, and if the system is not quite there yet, I may wait for now.
I suppose that once I sort out authentication with OneDrive, I could just have a download script that gets everything from OneDrive. However, I like the idea of having a nice, reproducible workflow, where I clone my packages on some server, run my script, and the data is loaded as needed.
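If I do end up going the plain-script route, I imagine something like the following; the URLs and file names are made up, and the authentication part is exactly the piece I do not know how to handle yet:

```julia
# Rough fallback sketch: fetch the data files directly instead of via artifacts.
const DATA_URLS = Dict(
    "measurements.csv" => "https://mycompany-my.sharepoint.com/.../measurements.csv?download=1",
)

function fetch_data(dir = joinpath(@__DIR__, "data"))
    mkpath(dir)
    for (name, url) in DATA_URLS
        dest = joinpath(dir, name)
        # download is in Base; skip files that are already present
        isfile(dest) || download(url, dest)
    end
    return dir
end
```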
Any thoughts on this are highly appreciated.