[ANN] Symbolics.jl: A Modern Computer Algebra System for a Modern Language

OK – but there was also a temporary message (now edited out) that the code had been updated. That is what my response relates to.

Looks like that needs a quick PR


It’s interesting that Maxima’s already “available” from Julia in some sense. Maybe the projects that you have in mind to build on top of Symbolics.jl could be specified in Maxima (or built in Maxima!). This would inform you of situations where (a) Maxima or something similar does just fine, or (b) You need to go beyond what Maxima can do already.

The Sage environment may require a virtual machine on Windows, but Maxima does not. If all you need is Maxima, there’s a certain disincentive to getting it via Sage.

If symbolic indefinite integration is of special interest (not that it is likely to be directly used in applications …) there are other implementations of the Risch “algorithm”, most notably the one in Axiom/Fricas. There’s also Rubi (the announcement of Rubi in Julia was what caught my eye recently, and why I’m here).

The Python version seems to have stalled, but I have not followed it recently. I think the time needed to load and parse thousands of Rubi rules was excessive. I don’t know if this has been addressed.

(The corresponding setup for Rubi in Lisp would be to translate the Rubi rules to Lisp. Loading Lisp is pretty fast, but for some Lisps it is also possible to convert it into FASL files, which load even faster. Or it could be made part of a binary coredump.)


These projects can’t be specified directly in Maxima or built in Maxima. That would break abstract interpretation of Julia code, which is one of the core features we need for some of the ModelingToolkit features. That said, there’s nothing that stops Symbolics.jl from making it easy to call a few algorithms from Maxima like we do from AbstractAlgebra.jl/Nemo.jl.

Maxima would be easier to get to given it’s already wrapped. What would be the reasons to additionally wrap Axiom/Fricas, i.e. what do they do better than Maxima?

Yeah, SymPy Rubi is apparently really good, outperforming Maxima (not in compute speed but in “integrals it can solve”) after adding it:

But it’s still in a PR because Python is too slow. Definitely a case for Julia to step in.


I created a (nearly empty) package Rubi.jl: GitHub - ufechner7/Rubi.jl: Symbolic, rule based integration in Julia

If you want to contribute, please add your ideas to: Discussions · ufechner7/Rubi.jl · GitHub


I am the original author of SymPy and SymEngine. Python is indeed too slow, so it is hard for SymPy to compete in terms of speed, but it is still very useful to a lot of people and there is a large community around it. The SymEngine project is written in C++ and it is very fast (a few years ago it was generally faster than any other computer algebra system that we tried, including Mathematica, Maple, GiNaC, Sage, SymPy, Maxima, etc.). @ChrisRackauckas have you run some benchmarks against SymEngine?

Furthermore, our vision with SymEngine is to be the “engine” that can be used in higher level languages such as Python, Ruby or Julia (it has wrappers to all those and more). That way we can all collaborate on a common codebase that can be natively used in all our languages of choice.

I noticed you wrote:

But as we expanded beyond “build Jacobian and simplify”, SymEngine wasn’t enough. It was missing too many features.

Indeed, SymEngine is missing features compared to SymPy. We were hoping you would be interested in working with us to contribute them, especially since you were already using it. :slight_smile: Because if you do, then not just Julia, but every language including Python and SymPy would benefit.

It is easier for the Julia community to contribute if the code is written in Julia, just like it is easier for the Python community to contribute to SymPy since the code is in Python. That is true and growing the developer community is important.

But I believe it is important to write libraries that can be used from any language. C++ is one of the options for such a common language. Last time I tried, Julia’s ahead-of-time compilation to static executables was not yet as easy as with C++ or Fortran:


Hi @certik! Of course I know who you are :laughing:, though I know you and @isuruf are probably sad to read about this.

SymEngine is pretty fast. The release doesn’t mention performance qualms with SymEngine, only SymPy. The only performance qualm I have with SymEngine is the fact that specialized tools like FLINT can beat it in their specific regimes and we want to specialize the calls for that, but that’s relatively minor. It’s more features and the interaction with Julia’s expressions that came into play here.

We did contribute to the SymEngine community for years (community more than code), and it was the preferred symbolic underbelly of SciML through ParameterizedFunctions.jl for a very long time. But there were some issues. One is that we just kept growing, and as our projects kept growing they needed more customization. Our symbols need arbitrary metadata to store information about types, flow properties, units, etc. for doing context pushing in ModelingToolkit (I’m sure adding a boolean flow to the symbol type in SymEngine’s C++ core would be a non-starter, but this kind of thing was crucial for us). I don’t know of other CASs that allow arbitrary customization of carried metadata from Julia (which would also be an issue with Axiom, Maxima, etc.). We also need array symbolics, i.e. not arrays of symbolic variables but variables which represent arrays and act with non-commutative rules. So SymEngine is still probably a better CAS right now, but it isn’t a better CAS for ModelingToolkit.jl, which has been a big driver of this project.
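As a toy illustration of the "symbols carrying arbitrary metadata" idea described above, here is a minimal Python sketch. All names (`Sym`, `getmetadata`, `setmetadata`) are hypothetical and are not the Symbolics.jl API; the point is only that each symbol owns an open-ended metadata dictionary, updated non-destructively:

```python
# Hypothetical sketch of metadata-carrying symbols, in the spirit of
# the design described above.  Nothing here is real Symbolics.jl code.

class Sym:
    """A symbolic variable with an attached metadata dictionary."""
    def __init__(self, name, **metadata):
        self.name = name
        self.metadata = dict(metadata)   # e.g. units, flow flags, types

    def getmetadata(self, key, default=None):
        return self.metadata.get(key, default)

    def setmetadata(self, key, value):
        # Return a *new* symbol so existing expressions are unaffected.
        s = Sym(self.name, **self.metadata)
        s.metadata[key] = value
        return s

    def __repr__(self):
        return self.name

# A connector current tagged as a "flow" variable with units:
i = Sym("i", flow=True, unit="A")
print(i.getmetadata("flow"), i.getmetadata("unit"))   # True A
```

A context-pushing transformation could then rescale units by calling `setmetadata` and get back a fresh symbol, leaving the original untouched.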

I agree we could in theory contribute back to a C++ core, but the time cost of doing that in a lot of cases is rather high. It was easy to prototype some fixes in ModelingToolkit and then plan to add them back to SymEngine, but quickly we learned that no one ever got the energy. That’s to be expected though: there really isn’t a feature benefit to our projects to spend that time. So over time there was a drift until MTK essentially had a CAS, in which case @shashi stepped in and wrote a similar-in-style rewriting system with similar specializations, as a launching pad to start playing around with new ideas in the area. We integrated tools from AbstractAlgebra.jl (i.e. parts of the Nemo CAS), created a metadata system, parallelized the internal operations with task-based parallelism, created a very extravagant extension to lambdify known as build_function, and… it would take a huge amount of time to do some of this in C++!

But maybe what we learn can be helpful for SymEngine! I think there will be some healthy cross-pollination between the projects. I think a joint dev conference day where we just share ideas we found useful could be fun.

Right now our focus is just on making the experience for Julia users as good as possible. I agree that multi-language usage is a noble intention, but it’s not a priority for us. We instead need something that does not compromise to act seamlessly with Julia, and there were always a few little hiccups that were hard to solve by something that wasn’t made to exactly match Julia’s semantics. You’ll see why this “exactly match Julia” is such a big deal for us with ModelingToolkit.jl when its paper comes out in 2 days (note the paper is not about CASs at all, but it’s built on Symbolics.jl and some important features).


Thanks for the reply @ChrisRackauckas. Just a couple of points:

  • I am not sad, I am happy for you and for Julia in general. As are most people that I know in the Python community. I am happy our work on SymEngine was an inspiration for you.

  • Yes, let’s collaborate on the ideas, brainstorm, etc.

  • SymEngine can use Flint as well as other specialized libraries and it has code that can convert from the general expression into the specialized one (typically a polynomial of some kind).

  • The issue you pointed out is I think relatively easy to fix, I outlined how to fix it there, but I was hoping somebody who actually uses the Julia wrappers would help. You reported the issue, but never commented further. I personally didn’t have time to fix every issue for every language, that would not scale well. My goal was to get that particular language community to take care of the wrappers. It actually works moderately well overall. As you said, this particular issue even if I fixed it would not change your overall direction.

I think this sums it up. :slight_smile: Python is now one of the most used languages, but not everybody uses Python, so I was always motivated to create libraries that can truly be used by anyone (and to quite a large extent we succeeded with SymEngine, as it is or was used by quite a few people in Julia, although as you explained, ultimately you decided to write a library in Julia itself for many reasons). I think even if Julia becomes more popular than Python, there will still be people who will use other languages.

However, it could be argued that it is the job of the Julia compiler and language itself to allow it to be used from other languages more easily, not of Symbolics.jl. Then Julia could become the “common language”.


Yeah we’re investigating adding a Poly type to the group which dispatches over to AbstractAlgebra.jl. Maybe it’s similar in spirit? We’ll let you know how that all goes.

Take it as a major compliment that you kept our attention for almost 4 years. We’re the power-iest of power users, so we always kind of knew that sooner or later we’d outgrow something that’s wrapped. Hell, the only reason Julia works for us is because we keep a direct line to Julialang/julia and the people who work on the compiler. We’ve been asking and funding all sorts of new features in Julia itself, like the new abstract interpreter, changed parsing of + and *, changes to allowed unicode, etc. I have an ongoing monthly reminder to bug @jeff.bezanson about adding support for types of the form X{T} <: T where T is non-concrete (a feature we want for controlling symbolic dispatching with metadata :wink:). We want the weird stuff, and so the fact that we were satisfied by SymEngine for so long means it’s pretty good. But as we make Symbolics.jl more customized towards Julia (and customize Julia to support it), it will keep getting harder and harder to support everything cross-language with a simple syntax, and it’s hard for us to justify the time there because that’s not our priority.


from one of the current yacas developers


Let me start with a simple explanation: I joined the project when it was already very advanced, and it wasn’t me who actually did the majority of the work. Nevertheless, having done some work on yacas, and having used yacas in my other projects, I came to some conclusions.

First of all, the language is quite fragile, and could use redesign. There are a couple of issues, but I would consider the scope of symbols to be the worst offender, especially when macros are involved. Also, the pattern matching machinery is not flexible enough.

Another big issue stems from the fact that in the beginning yacas grew very quickly. Thanks to this there is a lot of interesting functionality. But it came at a significant cost: yacas badly lacks a good implementation of core mathematical concepts. As a result the more specialized functionality either has some workarounds for the missing bits, or is not general enough. By the core mathematical concepts I mean sets, assumptions and basic algebraic structures.

To conclude: if I were starting yet another computer algebra system now, I’d put much more effort into designing the language (including better pattern matching) and into providing a reasonable set of core mathematical concepts before implementing the applicative functionality. And again, with the applicative functionality I’d go slower, making sure to provide at least the right interface before building anything on top.

This is just off the top of my head. I hope that you’ll find this useful. Let me know if you want to know any more details.



This is already possible (see pyjulia and the julia doc for C) and looks reasonably easy, although I don’t have any direct experience with it.

…(trying, responding as email…) (editing. Apparently my email system fills in characters like apostrophe and tab ? from some weird font. Sorry)

These projects can’t be specified directly in Maxima or built in Maxima. That would break abstract interpretation of Julia code, which is one of the core features we need for some of the ModelingToolkit features.

I do not understand this.
Anything at all can be expressed in functional notation with tags, like this:
The_Integer_Result ( The_Integer_Plus (The_Integer(4), The_Integer(5)))

or the tags can be implicit in the names as

declare([x,y], integer)

I don’t know how this Julia requirement of abstract interpretation corresponds to Maxima, but perhaps this:

external syntax: x+y
internal representation (which might count as some kind of intermediate code, but here is just Lisp): ((mplus) $x $y) … or, if simplified, ((mplus simp) $x $y)
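As a toy illustration of the two notations above (this is plain Python, not real Maxima or Julia internals): both the explicitly tagged functional form and the Lisp-style prefix form are just labeled trees.

```python
# Toy sketch: the tagged functional notation and the Maxima-style
# internal form are both labeled trees.  All names are illustrative.

def the_integer(n):
    return ("The_Integer", n)

def the_integer_plus(a, b):
    # Tags make the intended interpretation explicit.
    assert a[0] == "The_Integer" and b[0] == "The_Integer"
    return ("The_Integer_Result", a[1] + b[1])

# The_Integer_Result(The_Integer_Plus(The_Integer(4), The_Integer(5))):
print(the_integer_plus(the_integer(4), the_integer(5)))
# ('The_Integer_Result', 9)

# Maxima-style internal form for x + y: head first, then operands,
# analogous to ((mplus) $x $y):
mplus_form = ("mplus", "$x", "$y")
print(mplus_form[0], list(mplus_form[1:]))
```

Either way, a consumer walks the tree and dispatches on the head tag, which is all that "abstract interpretation" of such a structure requires at this level.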

That said, there’s nothing that stops Symbolics.jl from making it easy to call a few algorithms from Maxima like we do from AbstractAlgebra.jl/Nemo.jl.

In order to do so you presumably need to have functionality, in Maxima or in Julia or both…

Convert_from_Julia_to_Maxima(arbitrary_expression_in_Julia)
Convert_from_Maxima_to_Julia(arbitrary_expression_in_Maxima)

The arbitrary Maxima expression might extend beyond simple algebraic expressions to equations, differential forms, matrices, function definitions, series, strings, files, …

If there is something that Maxima can represent and Julia cannot, that is suggestive of an area to check out. Is this a deliberate omission?

If there is something that Julia can represent, but Maxima cannot now (or, with some handiwork, ever) represent, that would be interesting to see.
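To make the shape of such converters concrete, here is a toy Python round-trip sketch. Everything is hypothetical: "Julia-like" trees are modeled as `(op, args...)` tuples and "Maxima-like" trees as Lisp-style lists with the head first, as in `((mplus) $x $y)`; real converters would have to handle the far richer expression types listed above.

```python
# Toy sketch of Convert_from_Julia_to_Maxima / Convert_from_Maxima_to_Julia
# for simple algebraic expressions only.  All representations are made up.

OP_TO_MAXIMA = {"+": "mplus", "*": "mtimes"}
OP_FROM_MAXIMA = {v: k for k, v in OP_TO_MAXIMA.items()}

def to_maxima(expr):
    if isinstance(expr, tuple):                  # (op, arg1, arg2, ...)
        op, *args = expr
        return [OP_TO_MAXIMA[op]] + [to_maxima(a) for a in args]
    if isinstance(expr, str):                    # symbol x -> $x
        return "$" + expr
    return expr                                  # numbers pass through

def from_maxima(expr):
    if isinstance(expr, list):                   # [head, operands...]
        head, *args = expr
        return (OP_FROM_MAXIMA[head],) + tuple(from_maxima(a) for a in args)
    if isinstance(expr, str) and expr.startswith("$"):
        return expr[1:]
    return expr

e = ("+", "x", ("*", 2, "y"))                    # x + 2*y
m = to_maxima(e)
print(m)                                         # ['mplus', '$x', ['mtimes', 2, '$y']]
assert from_maxima(m) == e                       # round-trip is lossless here
```

The round trip is lossless here only because the toy operator table covers everything in the expression; the interesting design work is exactly what happens when one side can represent something the other cannot.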


If symbolic indefinite integration is of special interest (not that it is likely to be directly used in applications) there are other implementations of the Risch algorithm, most notably the one in Axiom/Fricas.

Maxima would be easier to get to given it’s already wrapped. What would be the reasons to additionally wrap Axiom/Fricas, i.e. what do they do better than Maxima?

There are claims that the Risch algorithm in Axiom is superior. The Axiom world view (categories, types, etc.) may be more supportive of the Julia world view than other CASs like Maxima, Mathematica, maybe Maple, or Reduce (there was a “mode-Reduce” for a while).


The Python version seems to have stalled, but I have not followed it recently. I think the time needed to load and parse thousands of Rubi rules was excessive. I don’t know if this has been addressed.

Yeah, SymPy Rubi is apparently really good, outperforming Maxima (not in compute speed but in integrals it can solve) after adding it:

Rubi integrator · Issue #12233 · sympy/sympy · GitHub

But it’s still in a PR because Python is too slow. Definitely a case for Julia to step in.

I think it is pretty clear that Rubi could be stuffed into Maxima if there were a sufficiently motivated individual [to pick up where I left off, or start from scratch]. My concern for the implementations of Rubi on top of X is that when X is Mathematica, Rubi can rely on various features – but those features might be missing from other systems. If the other system doesn’t have (say) sufficient simplification programs, then Rubi might not really work so well. Note that the decision problem “Is this expression equivalent to 0?” is provably unsolvable for a large enough class of expressions, so there is a range of capabilities that might exist in any given system.

This result, due to Daniel Richardson, has the consequence that the problem “does this integration problem have a solution in terms of elementary functions?” is recursively unsolvable.

And as I’ve mentioned, symbolic INdefinite integration is not a big seller. Look at published tables: a very small portion covers indefinite integrals.

Definite integration could be entirely numerical – that is solved by quadrature programs – but with non-numeric parameters it becomes interesting symbolically. To do that relying on the fundamental theorem of calculus is not a great idea, since the interesting tasks require contour integration. Paul Wang’s MIT thesis worried about that. There are subsequent developments that could be influential, using clever special functions, but the FTOC is not enough. You might not even be able to tell if the FTOC works unless you can satisfactorily analyze the behavior of an arbitrary expression of several complex variables on a path of integration.

Another approach to definite integration, sort of Rubi-like, is to take all the information in (say) the DLMF, the Digital Library of Mathematical Functions, and store/retrieve it automatically. A pointer to an incomplete version of this: my old TILU project.

I assume there are packages (say for ODE solving) that benefit from the Julia environment, but understanding the dependency would be useful, if the intent is to become “more symbolic” in the future.


These two are related, this is the big point. The ModelingToolkit.jl paper is out now and explains some of those details:

To me, the main purpose of such a CAS is to support such computations within abstract interpretation, mixing with the language and libraries like ChainRules.jl. That’s where it comes from, and it would be hard for something not embedded within the host language to interop seamlessly in some of these applications. Of course, we can still call out to some other libraries here and there, but this is why it is important that the graph building is really Julia-based.

No, conversions would be fine. We do it with AbstractAlgebra already, we could do it with others where it makes sense. Sounds like using the Axiom Risch is one of those places where it could. Not a default so that the dependencies are low, but an extension axiom_risch(ex) that converts, integrates, and converts back would be interesting to have around.


That’s quite the interesting paper!
(I was a bit surprised though that there is no link to the MTK repository/website, and that the code that was used to compute the results of the paper is not referenced/made available)

Seeing as you seem to be positioning MTK (among other things) as a Modelica competitor: One of the strong attractions of using Modelica (for a simple user/modeler) is the availability of a comprehensive and vetted Standard Library of compatible/standardised components/building blocks. Do you have plans to create something similar for MTK usage? Or maybe some kind of framework to let the community assemble/create a collection of specialised sub-libraries?

Also, how familiar are you with MetaModelica, which also seems to me an effort towards unifying the modelling and the solution/computation domains in one language, like you do with Julia and MTK here. See this recent paper on it that also compares to Julia.


I think I’ve seen some mention of a tool (? function named dymola?) which can read Modelica code and convert it to ModelingToolkit code.

An important element in several Modelica tools is a flow sheeting tool (somehow similar to Simulink) which makes it possible to drag and drop model units from a palette/library onto a canvas, and then connect the units graphically. I don’t know if there are plans for something like that in ModelingToolkit, though.

We’re building out a standard library right now. There’s going to be a few announcements on that in a few months. There’s just far too much to write about :laughing:.

There are tutorials in the documentation.



I think that’s a lot more work than just embedding it into Julia and getting Julia’s semantics for free! Plus you lose the whole community: the community of a programming language is very large in comparison to modeling tools. This is also why MTK has Symbolics.jl underneath it: why implement derivatives and simplify and all of that specifically for a modeling language? If you abstract that out to be over a CAS, you can support a larger project and grow a larger community around it. Overall you end up getting a lot more “user coverage” finding what works and what doesn’t.


That’s great to hear! As a user, I very much like that I can just fire up OpenModelica/OMEdit, connect a couple of tanks, pipes and pumps, a PID controller, and a state machine together (in a GUI, too) to implement a multi-domain model for an industrial process, where I “just” have to deal with defining the correct (physical) parameters for the involved components.

Sure. I was more thinking that an excited reader (that doesn’t lurk here :sweat_smile:) has to start searching online, or browse referenced papers, instead of clicking a footnote in the paper pointing to mtk.sciml.ai/
Maybe that’s a pet peeve of mine, though – papers on some implementation/package/library, sometimes old and/or obscure, without a link to the source.

Re the code, I was thinking less of the toy/demo problem, and more of the more “realistic” big HVAC model, which surely has more detailed models, probably thermodynamics, maybe media models, and is benchmarked against a Modelica implementation. I thought that those models would be made available, e.g. for the OpenModelica folks to take a stab at and benchmark from “the other side”. Considering the danger of less representative benchmarks in general (especially across tool/lang boundaries), that would surely strengthen the case for MTK, no?

Afaict, OpenModelica’s Modelica compiler is already mainly (fully?) implemented in MetaModelica (bootstrapped, too). Also, I’m not sure which community is lost here; the Modelica (the language) community is not that small either, and it is heavily backed by industry as far as I can tell.
Surely, MetaModelica does not “gain” the Julia community, though, but that is not the aim of MetaModelica. This was more meant as a reference to a similar embedding effort in Modelica. :man_shrugging:

Also, you’re surely correct concerning the wider scope of ML tooling, compatibility with many Julia packages, etc. that MTK enables!

Anyway, I’m glad to see all this! In the end, as a user, I can only win if there is more choice! :smiley:


@JeffreySarnoff seems to have given it a shot, but it resulted in an error in the Threading testset:

   Evaluated: isequal(a*((1 + d)^-1) + (0.6666666666666666 + a)*(c^-1) + (b + (b^2))*(d^-1) + 2.0d*((1 + c)^-1)*(((c^-1) + (d^-1))^-1), a*((1 + d)^-1) + (0.6666666666666666 + a)*(c^-1) + b*(1 + b)*(d^-1) + 2.0d*((1 + c)^-1)*(((c^-1) + (d^-1))^-1))

Some adjustment is needed here, but it’s not clear what.
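One observation about the failure above: `b + b^2` and `b*(1 + b)` are the same function of `b` but different expression trees, and a structural `isequal` distinguishes them. A pure-Python toy (nothing Symbolics.jl-specific; expressions are modeled as nested tuples) illustrates the distinction between structural and mathematical equality that such a test has to pin down:

```python
# Toy illustration: structural equality of expression trees vs.
# agreement as functions.  Trees are (op, args...) tuples.

lhs = ("+", "b", ("^", "b", 2))       # b + b^2
rhs = ("*", "b", ("+", 1, "b"))       # b*(1 + b)

def evaluate(expr, env):
    """Recursively evaluate an expression tree in an environment."""
    if isinstance(expr, tuple):
        op, *args = expr
        x, y = (evaluate(a, env) for a in args)
        return {"+": x + y, "*": x * y, "^": x ** y}[op]
    return env.get(expr, expr)        # symbol lookup; numbers pass through

print(lhs == rhs)                     # False: different trees
print(all(evaluate(lhs, {"b": b}) == evaluate(rhs, {"b": b})
          for b in range(-5, 6)))     # True: same values everywhere tested
```

So the "adjustment" question is really whether the test should compare trees as written or compare after a normalizing step such as expansion.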


I think a simpler explanation of AD as an alternate interpretation of the same abstract program text, or as compiling the text, is given in this paper of mine:

… assuming you are willing to see AD in utmost simplicity in the context of Lisp.
I think the same framework (using a package to overload the usual operations) could be used for sparsity detection, but I have not looked at this in any detail.
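The "alternate interpretation via overloading" idea can be sketched with dual numbers in a few lines of Python. This is a generic illustration of forward-mode AD by operator overloading, not code from the paper: each value carries its derivative, and the overloaded operators propagate both.

```python
# Forward-mode AD by operator overloading: a Dual carries a value and
# its derivative, and +/* propagate both by the sum and product rules.

class Dual:
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def derivative(f, x):
    # Seed the input with derivative 1 and read off the output slope.
    return f(Dual(x, 1.0)).der

# d/dx (3*x*x + 2*x) at x = 4 is 6*4 + 2 = 26:
print(derivative(lambda x: 3 * x * x + 2 * x, 4.0))   # 26.0
```

The same skeleton supports the sparsity-detection idea: replace the derivative slot with a set of input indices, and let the overloaded operators union the sets as the program runs.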


You have reconstructed my foray correctly. The only alteration I had made to the PR branch was to change the default for that parameter from false to true … and with that change, the function matched its docstring.

Upon noticing a test had failed, and determining that the failure more likely than not had occurred as a consequence of my edit – I let go of the PR. Wholly unfamiliar with Symbolics.jl (and totally unaware of how it tests), changing the default state of a boolean flag to match the docs seemed without danger. It could be that the inline doc was in error, not the setting (then there would be some other error that fails when you tried to play). If that is not the case, then some other edit has been overlooked or underbaked – either way, to resolve this requires the (possibly brief) attention of someone with project local expertise.