I am testing the Julia language for critical applications in which our code must not have bugs. Alongside my practical tests, I would like to find general information from universities and national labs concerning the power, robustness, and durability of the Julia language for HPC and big data analysis.
Performance, “no right to bug” and high coding productivity are somewhat in conflict. Julia optimizes for performance and high coding productivity.
For “no right to bug” critical applications, I’d recommend staying away from Julia (“critical application” = crashes/bugs kill people). You definitely want a far older (= battle-tested) and simpler compiler / runtime / language, probably want a lot of static analysis and static type checking, and possibly want proof-carrying code / formal verification.
After lurking on the Julia bug tracker for some time, I would definitely refuse to bet lives on the Julia runtime not crashing due to a missed GC root, a subtle subtyping issue, or a subtle LLVM bug triggered by whatever you are coding. Julia is far too complex, too new, and too optimized for performance and programmer convenience for that. That being said, I have never worked in high-reliability contexts, so I can’t comment on what tools are used in that industry.
For “no right to bug” critical applications, I’d recommend staying away from Julia (“critical application” = crashes/bugs kill people). You definitely want a far older…
I believe an early version of Julia is used for an aircraft collision avoidance system; i.e., the advantages of Julia seemed to outweigh concerns about language maturity. Edit: More info here.
… and possibly want proof-carrying code / formal verification.
My understanding is that one could build a formal verification system on top of Julia. The lack of a tool such as ACL2 for Julia is not a problem with the language itself, but rather with the lack of tools built on top of it.
Just noting that, afaiu, they use Julia as a prototyping and pseudo-code specification language, and no critical application (like an actual aircraft) is intended to ever run Julia code. Instead, Airbus and Boeing are supposed to re-implement and verify the (readable and fast-to-simulate) reference implementation using whatever high-reliability development process and language they already use for their critical on-board aircraft systems.
This is responsible use, imo awesome, and not a “no bugs allowed” critical application: if the Julia runtime crashes under weird circumstances, or you trigger bugs in it, then no actual aircraft falls from the sky; instead some engineer has a bad day bug-hunting during development and testing of the actually deployed critical system.
Quoting Robert Moss from a talk about the next-gen TCAS: “There is some talk on this. The question is what about having Julia not just be the reference but to fly on the aircraft themselves. It’s not a crazy thought. I think Julia would have to go through a rigorous evaluation process, and there would have to be a stable release. Right now this is not what we’re pushing. I’d like to see that, of course.”
It would be interesting to hear an update on this. Maybe an invited talk at JuliaCon?
I hear Ada is used, and a buddy of mine who used to work on fighter jets says he used Ada. There is also a subset of Ada called SPARK used for formal verification. There’s even a fairly active thread on the Rust Discourse about Ada; tl;dr: Rust is not yet good enough to replace Ada in many of these high-reliability environments…
Also, there’s a difference between bugs with life-or-death consequences and bugs that lose lots of money or time. I’d say that if you literally cannot have bugs, you’ll be looking at a very niche set of programming languages and/or a very select group of developers.
I agree with your assessment, but I think this discussion grew out of a question that the original poster did not fully understand.
No one wants to have bugs, but actually ensuring that with any reliability is extremely costly. An organization with that goal in mind would not send someone to find out how by asking fairly random and unrelated questions (data analysis in astrophysics is not required to be bug-free in the sense of, say, TCAS) following a recent scientific discovery.
What is it about Ada that makes it particularly useful for this purpose? Could the Julia language inherit those features in the long run? Another formal language I would like Julia to learn from is Agda.
Programming in pure type theory involves a lot of tedious and repetitive proofs, and Agda has no support for tactics. Instead, Agda has support for automation via reflection. The reflection mechanism allows one to quote program fragments into – or unquote them from – the abstract syntax tree. The way reflection is used is similar to the way Template Haskell works.
Reflection is also what Julia metaprogramming is based on, so could it be used to extend the type system of Julia?
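For concreteness, here is a minimal toy sketch (my own example, not from this thread) of the quote/unquote mechanism that Julia macros build on: code is captured as an `Expr` tree, which can be inspected and rewritten before it is ever evaluated.

```julia
# Toy example: Julia code is quoted into an `Expr` tree, which macros
# can inspect and rewrite before it is evaluated.

ex = :(a + b * 2)      # quote: capture code as an abstract syntax tree
ex.head                # :call
ex.args                # Any[:+, :a, :(b * 2)]

# A macro receives its arguments as unevaluated ASTs and returns a new AST.
macro swap_args(call)
    @assert call.head == :call && length(call.args) == 3
    return Expr(:call, call.args[1], call.args[3], call.args[2])
end

result = @swap_args(1 - 10)   # expands to 10 - 1
result                        # 9
```

This is syntactic manipulation of programs, much like Template Haskell or Agda reflection, but it operates before type inference rather than extending the type system itself.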
No, and hopefully no one in their right mind is even thinking about this.
Imagine a spectrum where you trade off convenience and flexibility for catching errors at compile time. On this spectrum, Julia and Ada sit very far apart, and you will find languages like C++, C, and Fortran in between (and of course, more exotic ones too).
Julia was explicitly designed to be dynamic, convenient, and fast, with the compiler there to help you by making your code run fast if you cooperate with it, but rarely enforcing anything at compile time. In fact, apart from trivial syntactic mistakes, the effort of “getting your program to compile” is mostly unknown to Julia users.
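As a small illustration of that last point (my own example): a method containing a type error is accepted without complaint at definition time, and only fails at run time, when it is called with concrete argument types.

```julia
half(x) = x / 2       # defining this never triggers any type checking

half(3.0)             # 1.5 — a fast specialized method is compiled here

# The erroneous call is only rejected when it actually happens:
try
    half("hello")     # no compile-time error anywhere...
catch e
    typeof(e)         # ...but a MethodError at run time
end
```

A static language on the other end of the spectrum would reject the bad call before the program ever ran; Julia instead defers the check to the moment of dispatch.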
IMO, a number of things, among them that Ada has been used in this domain for a while, and now has SPARK.
I don’t think Julia is well suited for this domain, nor do I think it would be worth the effort to try to make it so. Julia already has a nice focus, scientific computing, and fairly broad scope outside of that focus. If the developers were looking for more challenges, then embedded systems would be my suggestion, since one would like to deploy trained machine learning models there and I suspect (but have not done the experiment) that Julia is currently not suited for tiny embedded systems. Of course, I’d love to be wrong about that.