Gradual Julia-ization of Python libraries

I believe this is something heavily discussed within TongYuan between @JohnnyChen94 and me. We have engaged in some local open-source activities and have thought quite a lot about user-side adoption of Julia-based scientific computing frameworks (this is necessary if we want Julia usage to grow among the target users and make the company successful).

We have also tried a lot of approaches.

JNumPy is very similar to what @kdheepak assumed about the so-called “PythonLimitedApi.jl”. Integrating PackageCompiler.jl could be awesome, but we did not previously consider it, because loading multiple shared libraries that are all created by PackageCompiler.jl is a challenge.

module example

using TyPython
using TyPython.CPython

@export_py function mat_mul(a::AbstractArray, b::AbstractArray)::Array
    return a * b
end

function init()
    @export_pymodule _example begin
        jl_mat_mul = Pyfunc(mat_mul)
    end
end

# the following code is optional,
# but it makes loading from Python much faster after the first time.
precompile(init, ())

end

More about PackageCompiler.jl: Chen and I have thought that we might provide a batteries-included build based on PackageCompiler.jl, where we include a large number of Julia libraries for common scientific computing scenarios and finally export well-designed C functions for Python integration. This looks very promising for commercial products, and I feel sad that Syslab (TongYuan’s scientific computing IDE product based on Julia and VSCode) didn’t take this approach (it seems a bit late now). However, this approach is still limited because Julia applications built this way cannot be modular: running multiple Julia sysimages in a single process is pretty problematic. Chen said he was willing to work around this with inter-process solutions (containers, local/remote services, etc.).
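As a rough sketch of the “export well-designed C functions” idea (nothing we actually ship; MyKernels, mat_mul_sum, and the paths are hypothetical names made up for this post):

# A hypothetical package exposing a C-callable entry point.
module MyKernels

# Base.@ccallable marks the function as callable from C, and therefore
# reachable from Python via ctypes/cffi; only C-compatible argument and
# return types can cross this boundary.
Base.@ccallable function mat_mul_sum(n::Cint)::Cdouble
    A = rand(n, n)
    B = rand(n, n)
    return sum(A * B)
end

end

# Build a shared library (with its sysimage) around the package:
using PackageCompiler
create_library("MyKernels", "MyKernelsCompiled"; lib_name = "mykernels")

The catch is the one mentioned above: loading one such library into a Python process is fine, but each independently compiled library carries its own Julia runtime and sysimage, which is exactly where loading several of them in one process becomes problematic.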

SyslabCC is an AOT compiler developed by TongYuan, and you are very welcome to collaborate with us on it. This AOT compiler is capable of producing libraries for integration with languages such as Python. We have recently added C++ code generation and support for active/passive multi-threading (e.g., Threads.@threads for-loops). The performance is competitive, at least not notably slower than native Julia for computation tasks.
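For concreteness, code like the following is the kind of thing the compiler handles, including the Threads.@threads loop (this is just an illustrative sketch I made up for this post, not one of our benchmark kernels):

# A plain multi-threaded kernel: y .= a .* x .+ y, written with an
# explicit Threads.@threads for-loop.
function axpy!(y::Vector{Float64}, a::Float64, x::Vector{Float64})
    Threads.@threads for i in eachindex(x, y)
        @inbounds y[i] = a * x[i] + y[i]
    end
    return y
end

x = rand(1_000_000); y = rand(1_000_000)
axpy!(y, 2.0, x)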

You can run the benchmarks if you have the Syslab community version installed; the source code is located in the benchmarks directory of github.com/Suzhou-Tongyuan/JuliaCon2024-JuliaAOT.

We are currently using our AOT compiler to write simulation algorithms and integrate them into Sysplorer, TongYuan’s multi-disciplinary system modeling and simulation environment (I believe this is where the company actually earns money). The combination of high performance and a rich ecosystem is a game changer for this area: all the algorithms we wrote with Julia AOT are far faster than some traditional “established solutions” for simulation.

10 Likes

I think this is the right question, and the answer is that Julia solves the two-language problem where the two languages offer complementary technical benefits. Having a two-language split between Julia and Python is a different situation, one related to popularity, ecosystem, and legacy.

We write more about this in our paper “Advancing the Robotics Software Development Experience: Bridging Julia’s Performance and Python’s Ecosystem” (arXiv:2406.03677):

Python and C++ serve as the foundational languages in robotics programming, each providing unique advantages—Python for its ergonomics and C++ for computational efficiency. This duality has led to prevalent multi-language architectures where C++ handles core computations and Python is used for higher-level integration via wrappers. The Python wrappers are then used for integration (or “glue”) code, as well as interactive exploration and prototyping. This general approach of integrating a fast compiled language and an ergonomic scripting language has been the bedrock of technical computing for decades, largely popularized by MATLAB providing wrappers to linear algebra routines written in FORTRAN. Performance in this paradigm is achieved by “vectorizing the code”, which ensures that the performance-critical loops are in the fast language (in computational “kernels”), rather than in the slow host language. Many robotics libraries follow this paradigm, including Drake~\cite{drake}, GTSAM~\cite{dellaert2012factor}, OpenCV~\cite{opencv_library}, and Open3D~\cite{Zhou2018}.

The Robot Operating System (ROS)~\cite{ros2} exemplifies another approach, facilitating Python and C++ interoperation by relying on an interface definition language (IDL) to generate code in both languages. The distributed system approach offers other advantages, such as reducing coupling and promoting scalability, and it aligns with the microservices architecture popular in software engineering more broadly. Generating C/C++ code that performs substantial computation (as opposed to just serving as an interface) is yet another approach taken by some libraries and systems \cite{symforce,open2020,mattingley2012cvxgen}.

These traditional Python-C++ architectures, while robust, come with their own set of challenges, particularly in terms of development experience, system complexity, and performance overhead. This context sets the stage for considering alternative approaches that might streamline development without sacrificing run-time performance. Julia, designed for both high performance and ease of use~\cite{julia_fresh}, emerges as a promising alternative.
Despite its potential, Julia’s current ecosystem limitations, such as the absence of a ROS 2 client library, pose challenges for its adoption in robotics. We propose an architecture that leverages the mature ecosystem of Python alongside Julia, reducing the reliance on C++. This approach maintains a two-language paradigm but shifts the focus from complementary technical benefits to leveraging social and ecosystem advantages. Moreover, the compatibility between Julia and Python, facilitated by features like interactivity and the characteristics of dynamic typing, minimizes the development mismatch. We further explore Julia’s application in robotics, illustrating its integration with Python.

6 Likes

I am not sure that extending the concept of the “two language problem” in this direction is useful.

From your article, it is my impression that Julia could jolly well handle the whole stack you describe (except maybe for some low-level glue stuff best handled in C), and the only reason people stick to Python is avoiding learning something new, which you euphemistically describe as “avoiding disrupting the existing workflow”.

That said, reliable language interop is a must for transitioning large codebases, so they can be converted piecemeal. But once all the relevant calculations have been migrated to well-tested and cleanly written Julia libraries, it makes little sense to keep a thin veneer of Python over it.

3 Likes

Could you explain more what this means? I’m reading this as multiple Julia processes with different sysimages sharing a core, but that didn’t feel right.

I believe the same post contains the explanation:

So, basically, it seems that a single process can’t depend on multiple Julia libraries in-process via FFI.

2 Likes

Yes, I agree that Julia could handle the whole stack, but in practice many parts of the robotics stack only have Python and C++ support. ROS 2, the main example we used in the article, does not have (and will not in the near future have) a native Julia “ROS Client Library” that doesn’t rely on PythonCall.jl. There is a lot of type and code generation machinery involved, all in Python and C++.

Another example, which we don’t mention in the article, is boston-dynamics/spot-sdk on GitHub, which is one of two interfaces (the other being boston-dynamics/spot-cpp-sdk, which, as you can tell from the name… is second-class) that exist for the Boston Dynamics Spot robots.

The underlying communication with the robot is some variant of gRPC/Protobuf. Julia has a gRPC client library, so in principle someone could build on that, if they figured out how to handle whatever bespoke stuff might be around it, like authentication.

and the only reason people stick to Python is avoiding learning something new, which you euphemistically describe as “avoiding disrupting the existing workflow”

Well, yes and no. It’s not just that people want to avoid learning something new (which, I think we have to acknowledge, does take effort; my coworkers don’t want to avoid learning anything new, they want to learn new things about the things they care about). They primarily want to avoid writing more code. There are all sorts of code that fall under the category of “some low-level glue stuff”; it’s a matter of perspective.

But once all the relevant calculations have been migrated to well-tested and cleanly written Julia libraries,

I don’t think the article does a good job of touching on this at all, but one thing that I like to try to communicate to people about Julia is “clean”, even though I don’t think I ever actually say “clean”. I try to convey that the package ecosystem is made of much smaller pieces than in other languages, because composition is easier. (If there is time, I’ll mention The Lisp Curse.) I think the benefits of an all-Julia stack are clear for composability, and it is fun to tease Python for lacking this really basic kind of composability, but it is also a small example of how “pervasive multiple dispatch” scratches a very important itch. For example: Is Julia’s way of OOP superior to C++/Python? Why Julia doesn’t use class-based OOP? - #92 by goretkin
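To make the “pervasive multiple dispatch” point concrete, here is a toy sketch (module and function names entirely made up for this post) of the kind of composition I mean: two pieces of code that know nothing about each other, plus user code that combines them with no glue layer:

# “Package” A: a generic algorithm written against no concrete type.
module GenericSolvers
export midpoint
midpoint(a, b) = (a + b) / 2   # works for anything supporting + and /
end

# “Package” B: a new number-like type, written with no knowledge of A.
module Intervals
export Interval
struct Interval{T}
    lo::T
    hi::T
end
Base.:+(x::Interval, y::Interval) = Interval(x.lo + y.lo, x.hi + y.hi)
Base.:/(x::Interval, n::Number) = Interval(x.lo / n, x.hi / n)
end

# User code: dispatch resolves + and / on Interval, so the generic
# algorithm works on the new type with zero wrapper code.
using .GenericSolvers, .Intervals
midpoint(Interval(0.0, 1.0), Interval(2.0, 4.0))   # Interval(1.0, 2.5)

That zero-wrapper composition is what I mean when I say the ecosystem is built from smaller pieces.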

it makes little sense to keep a thin veneer of Python over it.

The thinner the veneer the better, because language interop is inherently complex and complicated (though really I think language interop between Julia and Python is MUCH better than probably any other combination I have experienced. This is what we meant by “existing workflow”; it’s really a euphemism for “dynamically typed languages play together nicely, more than Python and C++ do”). And if, with little effort, you don’t need the veneer at all, then great, get rid of it. But if there is effort in scraping it off, and it truly is just veneer, then I don’t think it is necessarily worth it to get rid of it.

The truth is that it’s not just veneer: language interop brings substantial complexity.

3 Likes

I think some of this is due to the Matthew effect (see Wikipedia: Matthew effect). Some workflows are only possible in Python, so devs start using Python, and that ecosystem sees compounded growth. (Which, for an analogous reason, is partly why Julia has seen such impressive growth in scientific simulation!)

One example of Python-only functionality is efficient multi-node training of very large neural networks with DeepSpeed or FSDP. If you want to train a large language model, you need a tool like this, and such an ecosystem simply does not yet exist in Julia. And thus PyTorch will accumulate more users and contributors, who will add more features, and so on.

But the other big reason is legacy codebases. Even if a tool is technically superior in every way, developers of large codebases will not use it unless it easily fits into their existing stack. So the entry barrier – without something like PythonCall.jl – prevents adoption.
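For anyone who hasn’t used it, this is roughly what that entry point looks like from the Julia side with PythonCall.jl (a minimal sketch; the NumPy calls just stand in for “whatever already exists in your Python stack”, and the same package ships the juliacall module for the Python-to-Julia direction):

using PythonCall

# Reuse an existing Python library in place; no rewrite required.
np = pyimport("numpy")
x = np.linspace(0, 1, 101)      # a Python object, driven from Julia
y = np.sin(x)

# Convert to a native Julia array only where the Julia code needs it.
yj = pyconvert(Vector{Float64}, y)
sum(yj)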


Anyways, sorry to derail things; maybe we should get back on track a bit… I think PythonCall.jl is already seeing a lot of love, which is great. I’m very interested in hearing other ideas for what a simplified Python/Julia integration could look like! I already really like some of the ideas in this thread.

12 Likes