Oh, ok. Then I agree. But a script does not become a ‘Package’ (that was the term you used) just because it has *.toml files in the same folder. You can just have a zipped folder with a script.jl file plus Project/Manifest.toml files; the script defines no Module, and the folder does not follow the package directory hierarchy (src, test, etc.), yet it will be as robust as a package in terms of reproducibility.
The distinction I want to make is this: I think it is okay to use Julia without turning everything you write into a package, but it is much harder (and I see very little reason) not to make every little piece of code you write into a Project. I create projects just to answer questions with MWEs on Discourse, because it is so easy, and because I am sure they will not mess with or interact with the environments/packages for my PhD experiments or for previous MWEs I had downloaded before. It is just so clean. An experience I have not had with other languages/environments.
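For what it’s worth, the whole workflow for one of those throwaway MWE environments is only a few lines from the REPL. This is a minimal sketch; the directory name and the package added are placeholders, not anything from the thread:

```julia
import Pkg

dir = joinpath(homedir(), "mwe", "discourse-mwe")  # placeholder location
mkpath(dir)                      # make sure the folder exists
Pkg.activate(dir)                # the environment lives next to the script
Pkg.add("DataFrames")            # placeholder package; writes Project.toml + Manifest.toml

# Drop script.jl into the same folder; whoever receives the zipped folder
# can run Pkg.activate(dir) followed by Pkg.instantiate() to get exactly
# the same package versions.
```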
I think you have a great idea. I use Emacs in daemon mode every day, and it is a good option, in my opinion. Right now I am too busy preparing exams, but later I could put together a quick proof of concept.
Maybe a little off-topic, but using Julia to write cron scripts or any other sort of automated or scheduled calculation is actually surprisingly efficient. It may be obvious, but I had a mental block that “Julia is not suitable for scripting because of startup times” and avoided writing cron jobs for a long time. Then I suddenly realized that for this sort of task (mostly daily web crawling in my case) scripts are fine, since startup time can simply be ignored. What I am trying to say is that scripting comes in different flavors: the “time to first” problem affects only interactive scripts, and there are many other applications where Julia is just fine.
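For the record, such a job can be as small as a plain script plus one crontab line. The sketch below assumes the Downloads and Dates standard libraries; the URL, paths, and schedule are made up for illustration:

```julia
# crawl.jl -- hypothetical daily crawl; URL, paths, and schedule are placeholders.
# Example crontab entry (runs at 03:00 every day):
#   0 3 * * * /usr/bin/julia --project=/home/me/crawler /home/me/crawler/crawl.jl
using Downloads
using Dates

outfile = joinpath(@__DIR__, "data", "dump-$(Dates.today()).html")
mkpath(dirname(outfile))                              # create the output folder if needed
Downloads.download("https://example.com/listing", outfile)
@info "Saved crawl result" outfile
```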
I routinely use cron to run backups that sometimes run for tens of hours (when someone plops a big directory in the shared area). Burning even several minutes to precompile something would be a rounding error by comparison.
Of course if you have a cron job that runs every 5 minutes it’s a different story.
I think that’s overstating it. I checked, and even with --compile=min Julia seemed at least 3x faster than bash for for-loops (while slower than Python’s). It’s just the startup time of Julia itself you have to overcome: Julia has a 195 ms startup, not a lot, but bash has a 4 ms startup (a bit faster than Perl’s), making Julia about 48x slower to start, yet only about 5x slower than Python 3’s 38 ms, and you need a loop count a bit under 30000 to overcome Julia’s slow start.
You could say that’s not realistic for a script; maybe, maybe not, but with Julia’s default settings the loop count needed to break even is a bit higher still.
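To make that break-even point concrete, the arithmetic is roughly the following. The per-iteration saving is my own assumed figure, chosen only so the result matches the numbers quoted above; it is not a measurement:

```julia
# Back-of-the-envelope break-even estimate using the startup numbers above.
julia_startup = 0.195            # seconds
bash_startup  = 0.004            # seconds
startup_penalty = julia_startup - bash_startup            # ≈ 0.191 s

# Assumption: Julia's loop saves on the order of 6-7 microseconds per
# iteration relative to bash. With that figure, break-even lands a bit
# under 30_000 iterations, as stated in the post:
per_iteration_saving = 6.5e-6    # seconds per iteration (assumed)
break_even_iterations = startup_penalty / per_iteration_saving   # ≈ 29_000
```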
I would say we need both faster startup and Tiered Compilation, like C# has (with the “no drawbacks” experience Microsoft promises); since .NET Core 3.0 last September, it is enabled by default.
Enter Tiered Compilation. With Tiered Compilation, based on the usage of a method, it can get jitted more than once and can be hot-swapped at runtime.
The default experience for most .NET 5 workloads will be using the JIT-based CoreCLR runtime. The two notable exceptions are iOS and client-side Blazor (web assembly) since both require ahead-of-time (AOT) native compilation. […]
AOT-compiled apps can run efficiently in small places, and trades throughput for startup if needed. […]
There are two types of AOT solutions:
* solutions that require 100% AOT compilation;
* solutions where most code is AOT-compiled but where a JIT or interpreter is available and used for code patterns that are not friendly to AOT (like generics).
The Mono AOT supports both cases. The first type of AOT is required by Apple for iOS and some game consoles, typically for security reasons. The second is the preferred choice since it offers the benefits of AOT without any of its drawbacks.
CPU time is cheap, programmer time is expensive. For 99% of custom tasks, using a language one is familiar with makes sense.
Like @Skoffer, I also use Julia for some regularly run scripts. Being able to do CI on toy data and having a reproducible environment are invaluable. Finding out that a task you meant to run during the night, so the results would be ready when you wake up, actually errored for some silly reason is much more costly than even a few hours of CPU time. YMMV.
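A minimal pattern I find useful for those overnight runs is sketched below. The file names are made up and run_task.jl stands in for the real work; the point is just that the job runs in its pinned environment and that a failure is loud and leaves a nonzero exit code:

```julia
# nightly.jl -- hypothetical entry point for a scheduled job.
import Pkg
Pkg.activate(@__DIR__)   # reproducible environment pinned by Project.toml/Manifest.toml
Pkg.instantiate()

try
    include(joinpath(@__DIR__, "run_task.jl"))   # placeholder for the real work
catch err
    @error "Nightly job failed" exception = (err, catch_backtrace())
    exit(1)              # nonzero exit so cron/CI reports the failure instead of staying silent
end
```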
This is the problem with using Julia outside of the REPL for scripting around: my small bash scripts for automating things are finished long before Julia is done with its warm-up. Julia is good if you have long-running loops and calculations for simulations…
I don’t quite understand why the “scripting” cannot be done from within Julia rather than outside of it? I can see how running bash and firing up julia instances can be suboptimal, but I have successfully performed traditional scripting tasks in Julia. Much better than bash, imo.
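As an illustration of what I mean by “traditional scripting tasks”, here is a sketch of the kind of glue work I now do directly in Julia instead of bash. The paths and the external tar invocation are placeholders:

```julia
# Move yesterday's log files into a dated archive folder and compress it.
using Dates

logdir  = "/var/log/myapp"                       # placeholder path
archive = joinpath(logdir, "archive", string(Dates.today() - Day(1)))
mkpath(archive)

for f in readdir(logdir; join=true)
    endswith(f, ".log") || continue              # only touch the log files
    mv(f, joinpath(archive, basename(f)))
end

# Shell tools are still one `run` away when they are the right fit.
run(`tar czf $(archive).tar.gz -C $(dirname(archive)) $(basename(archive))`)
```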
I think it’s fair. I don’t think there will ever be a world where users won’t have to know a little bash. It’s sort of the lingua franca in programming. If you want a script to be as widely usable as possible, it should be callable from the shell.
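And making a Julia script callable from the shell is just a shebang line away; a trivial sketch (the file name is made up):

```julia
#!/usr/bin/env julia
# greet.jl -- minimal shell-callable Julia script.
# Make it executable with `chmod +x greet.jl`, then run `./greet.jl world`.
name = isempty(ARGS) ? "there" : ARGS[1]
println("Hello, $name!")
```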
OK, but then why is this a problem with the language?
It’s not. My OP asked whether it is possible to use Julia for this use case now (which it sort of is, with --compile=min) and whether this use case is in the target space for the language eventually (there is evidence that it is). If the answer had been “no, that’s not what Julia is for”, that would be fine of course; I would just have to look elsewhere for my needs.
BTW: my usual scripting tasks (prior to Julia) were of a nature where whether they ran for 10 seconds or 30 seconds did not make a difference. It usually took me much longer to come up with the correct script (in bash), and it saved me tons of time anyway. Just my 2c.
(I myself use bash scripts frequently, so don’t take this as bashing bash).
But this is actually a slightly self-defeating argument.
* If your script takes long, then the startup of Julia doesn’t matter.
* If your script is that quick (<50 ms), then your system should be idle the whole time anyway and the few extra milliseconds should not make a difference, unless you’re calling it very frequently.
* If you’re calling it very frequently, then you can consider doing the outer looping in Julia as well, and then again the startup doesn’t matter; you even win by not launching bash at all (see the sketch below). (You can still keep the Julia code modular and reusable in separate files, whereas with bash functions that is not as easy, as far as I recall.)
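A sketch of that last point: move the outer loop into Julia so the runtime starts only once instead of once per invocation. The input directory, file extension, and per-file work are placeholders:

```julia
# Instead of a bash loop that launches a fresh julia process per file,
# pay the startup cost once and do the outer loop in Julia itself.
function process(file)                        # placeholder for the real per-file work
    println(file, ": ", length(readlines(file)), " lines")
end

for file in readdir("data"; join=true)        # hypothetical input directory
    endswith(file, ".csv") || continue
    process(file)
end
```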
I don’t take that as a criticism
Everybody can take whatever they personally think is right. I like Julia as a language, but only within Juno or the REPL, not for something you compile quickly and show to someone on another machine without Julia installed.
Sort of… If you do using X and it takes 4 minutes to precompile, and the script is going to be used once to do something like grab a bunch of data files off the web and create some very simple summaries, and bash + some awk or whatever could do it in 800 ms, then Julia obviously loses.
Julia only wins when the script is reused multiple times, like maybe it grabs those data files daily and calculates some updates, and only on the first day does it need the precompile.
In my experience so far, without --compile=min I pay tens of seconds of compile time on every run of the script, not just the first run. If it were just the first run that was slow and the later runs fast (next day, next week, etc.), that would be fine for almost all scenarios (except throwaway scripts, as you mention).