With ModelingToolkit in development, I’d like to start using it soon – also for my basic CAS needs.

I recently played around with SymPy, and wonder whether ModelingToolkit can handle my cases there… A simple example: show that the Park-Clarke transformation matrix (for electrical machines) is orthogonal:
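For concreteness, here is the power-invariant Park (dq0) matrix written in plain Julia with a numeric orthogonality spot check using only the standard library. The `park` helper name and the power-invariant scaling are my choices for illustration; the exact matrix convention in the original example may differ by a constant factor:

```julia
using LinearAlgebra

# Power-invariant Park (dq0) transformation matrix.
# `park` is an illustrative helper name, not from the original post.
park(θ) = sqrt(2/3) * [ cos(θ)      cos(θ - 2π/3)   cos(θ + 2π/3);
                       -sin(θ)     -sin(θ - 2π/3)  -sin(θ + 2π/3);
                        1/sqrt(2)   1/sqrt(2)       1/sqrt(2)     ]

# Orthogonality T*Tᵀ == I holds for every θ; a numeric spot check:
T = park(0.37)
T * transpose(T) ≈ I(3)   # true
```

The symbolic version of this check is exactly the kind of thing that needs trig simplification rules (cos²+sin² and angle sums) to reduce `T*transpose(T)` to the identity.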

I don’t see why it wouldn’t. If it didn’t, it would just be a missing rule; in theory there are enough features in there that almost the same code should work.

Make it another symbol?

The core simplify handles this. Doing trig relations and all of that is just about having the right rules.

Thanks for response – but it is still somewhat vague to me:

Although the ModelingToolkit tutorial makes reference to Sym, I couldn’t find any examples of how to use it.

When I figure out how to make symbols, I can do so for π. But for simplification to work, ModelingToolkit must be made to understand what sin(θ_a − 2π/3) etc. means, which is not straightforward? Of course, in my example I can avoid the problem of π by using sind() and replacing 2π with 360…
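For what it’s worth, symbols are created with ModelingToolkit’s macros, and Julia’s `pi` is an ordinary numeric constant, so the angle expressions can be written down directly. A minimal hedged sketch (the variable name is mine, and the macro API has moved around between versions):

```julia
using ModelingToolkit

@parameters θa              # create a symbolic parameter
expr = sin(θa - 2π/3)       # 2π/3 folds to a number; sin(...) of a symbol stays symbolic
```

So π needs no special handling; the open question is only whether simplify knows enough trig identities to reduce the resulting expressions.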

Last time I played around with ModelingToolkit I noticed that it didn’t know the addition theorems for trigonometric functions, so clearly there were essential rules missing. (It worked fine in SymPy.) Where would one add such a rule in the repo? Also, couldn’t we systematically copy the rules from SymPy somehow? They must define them somewhere as well.
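For reference, the rewrite rules live in SymbolicUtils.jl, which ModelingToolkit builds on, so that is where an angle-addition rule would go. A hedged sketch using its @rule macro (syntax as I understand the SymbolicUtils API of that era; details may have changed):

```julia
using SymbolicUtils

@syms x::Real y::Real

# ~a and ~b are pattern slots matching arbitrary subexpressions.
addition = @rule sin(~a + ~b) => sin(~a)*cos(~b) + cos(~a)*sin(~b)

addition(sin(x + y))   # rewrites to sin(x)*cos(y) + cos(x)*sin(y)
```

A rule like this is just a Julia value, so adding it to the default simplification rule set is a small PR rather than a compiler change.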

Also note that ModelingToolkit outputs don’t render as LaTeX by default in Jupyter, while SymPy outputs do. I found it a bit annoying to have to add conversions everywhere.

True for SymPy, but if I latexify the output of SymPy, the code is of lower “quality” for some reason (it removes \frac and introduces “ugly” * for multiplication).

Anyway, ModelingToolkit seems to be in active development, and I like a lot of its ideas.

I’ve talked with @shashi about having separate environments for larger sets of rules, as well as extendable rules.

It’s definitely in a major churn phase still.

Did you use latexify(eqs)? It spits out LaTeX like:

Is what you’re missing the notebook integration? Maybe @stevengj could help us get that set up with IJulia? I assume it’s just a show method and MIME type we need to overload.
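It should indeed just be a show overload on the text/latex MIME type, which notebook frontends query for rich output. A self-contained sketch with a hypothetical stand-in type (LatexEq is mine, not an MTK type):

```julia
# Hypothetical stand-in for an equation type; the real fix would add a
# method like this for ModelingToolkit's own types.
struct LatexEq
    str::String
end

# Jupyter/IJulia ask for this MIME type when rendering rich output.
Base.show(io::IO, ::MIME"text/latex", eq::LatexEq) = print(io, "\$\$", eq.str, "\$\$")

eq = LatexEq("\\frac{du}{dt} = -u")
repr(MIME("text/latex"), eq)   # the string "$$\frac{du}{dt} = -u$$"
```

With a method like this in place, Jupyter picks the LaTeX representation automatically, with no explicit conversions needed.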

@korsbo isn’t there an option for this? I would just like to remove the * altogether. I would argue that should be the default, and adding the * should be a non-standard option.

Good to hear! Yes, ModelingToolkit isn’t all there yet, but since it’s pure Julia a lot of these “add a few more rules” things are just adding known analytical solutions to lists of Julia code, so everyone is invited to help. Many small PRs from many people are likely what will grow this area of the ecosystem. @shashi and @YingboMa are blazing a trail to do a full demonstration of a purely Julia Modelica-like system in just a few months, but even when we get there we will still need to ensure we hit all of these “high level bits”, like Latexification, to meet the goals of the community. I plan to make sure we get there.

Okay, I’ll make sure to ping you when I get there. I have a few grant proposals first, but I’ll be getting to writing a bunch very soon. One of the major moves in the next month will be documenting the full SciMLInterface (as a new repo). To standardize the symbolicification of various mathematical domains, we’ve greatly expanded SciML’s problem/solve interface:

There’s NonlinearProblem, OptimizationProblem, etc. And then there’s a bunch of overarching solver libraries that give a unified interface to these descriptions, such as Quadrature.jl and GalacticOptim.jl. So what was “DiffEq Problems” now includes all of the core numerical algorithms with a related symbolic system type in ModelingToolkit, which allows for going back and forth between symbolic manipulations and numerical solving. The major goal here will be to have clear documentation showing the symmetries of the design and interface. While GalacticOptim.jl’s documentation is just “hey I am all optimization packages”, showcasing how optimization, differential equation solving, nonlinear solving, etc. all have the same design and how it couples to the symbolic interfaces has been challenging to document.
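To make that symmetry concrete, here is a hedged sketch of the shared problem/solve shape across two of those domains (the functions and solver choices are my own illustrations, and constructor signatures vary between versions):

```julia
using DifferentialEquations, GalacticOptim, Optim

# An ODE: define a problem, then call solve.
ode_prob = ODEProblem((u, p, t) -> -p * u, 1.0, (0.0, 1.0), 2.0)
ode_sol  = solve(ode_prob)

# An optimization: same define-a-problem / call-solve shape.
opt_prob = OptimizationProblem((u, p) -> (u[1] - p[1])^2, [0.0], [3.0])
opt_sol  = solve(opt_prob, NelderMead())
```

The point is that each numerical domain gets a Problem type with a matching symbolic System type in ModelingToolkit, and solve dispatches to whichever backend library handles that domain.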

The next underlying issue is how to document the PDESystem interface. PDESystem now exists and is usable. If you write:

using NeuralPDE, Flux, ModelingToolkit, GalacticOptim, Optim, DiffEqFlux
using Plots

@parameters x y t
@variables u(..)
@derivatives Dxx''~x
@derivatives Dyy''~y
@derivatives Dt'~t

# 3D PDE
eq = Dt(u(x,y,t)) ~ Dxx(u(x,y,t)) + Dyy(u(x,y,t))

# Initial and boundary conditions
bcs = [u(x,y,0) ~ exp(x+y)*cos(x+y),
       u(0,y,t) ~ exp(y)*cos(y+4t),
       u(2,y,t) ~ exp(2+y)*cos(2+y+4t),
       u(x,0,t) ~ exp(x)*cos(x+4t),
       u(x,2,t) ~ exp(x+2)*cos(x+2+4t)]

# Space and time domains
domains = [x ∈ IntervalDomain(0.0,2.0),
           y ∈ IntervalDomain(0.0,2.0),
           t ∈ IntervalDomain(0.0,2.0)]

You can send that off to DiffEqOperators.jl for an automated finite-difference discretization and solve, or send it off to NeuralPDE.jl for a physics-informed neural network solver.
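For concreteness, a hedged sketch of the NeuralPDE.jl hand-off, roughly following the API of that era (the chain sizes, training strategy, and optimizer settings are illustrative, and the constructor signatures have shifted between versions):

```julia
using NeuralPDE, Flux, GalacticOptim, ModelingToolkit

# Wrap the equation, boundary conditions, and domains from above:
pde_system = PDESystem(eq, bcs, domains, [x, y, t], [u])

# Physics-informed neural network discretization (illustrative settings):
chain = Chain(Dense(3, 16, σ), Dense(16, 16, σ), Dense(16, 1))
discretization = PhysicsInformedNN(chain, GridTraining(0.1))

# Turn the symbolic PDESystem into an optimization problem and solve it:
prob = discretize(pde_system, discretization)
res  = GalacticOptim.solve(prob, ADAM(0.1); maxiters = 1000)
```

The same PDESystem is the input in both cases; only the discretizer you hand it to changes.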

These are all expected growing pains moving from JuliaDiffEq -> SciML and taking on an expanding role of course. And most of the libraries are in place, so it’s just a big documentation effort at this point.

This is all on top of the fact that ModelingToolkit couples being a CAS with its role in symbolic-numerics. Of course, it makes complete sense for a symbolic-numerics system to be a CAS, because then every CAS feature for eliminating variables and simplifying equations can be applied to the equations you want to numerically solve. But fully documenting a CAS is hard, and making this relationship clearly understood is a bit harder, especially since there isn’t really a clear analogue to this full feature set in other ecosystems. For differential equation solvers it’s “oh, it’s like odeint but better”, but ModelingToolkit’s ODESystem? Well, it’s as if you deconstructed Dymola’s compiler into features of a SymPy-like CAS, coupled it to a differential equation solver like CasADi, and allowed special hooks into the numerical solvers for MTK-generated functions (features like plot(sol, vars=x) where x is a symbolic variable from ModelingToolkit). I’ve learned that description helps nobody, so it’s back to square one on that front.

It sure sounds like one hell of a task, and one hell of a vision. Really pioneering what one can do with a modern, high-level, and fast language like Julia.

I am a silent admirer from afar, but I just wanted to say that I am psyched about the progress you are making and the work you are doing, and wish you Godspeed in it.

I am also very much up for testing and giving feedback on documentation. I am not experienced in computer science or advanced mathematics, but I find myself a quite functional person, so I think I could be a good fit for documentation-testing.

The “Comparison Against SymPy” page in the ModelingToolkit.jl docs seems to be a list of reasons to use ModelingToolkit.jl over SymPy. What are the advantages of SymPy? Are there use cases in which SymPy would be more suitable?

There’s more in SymPy right now. Like analytical solutions to some integrals. But other than that… really no. And the SymPy devs know it too, that’s why SymEngine exists to completely redo the internals of SymPy. ModelingToolkit (whose CAS portions will soon be Symbolics.jl) is actually pretty similar in its underlying structure to the C++ structures in SymEngine, and matches its performance (@shashi is then adding a few more pieces to even go beyond that, but that’s details).

So you can think of this project as a very actively developed SymEngine in pure Julia, where SymEngine is the “better developed” SymPy with missing features. But many of the features in SymPy that are missing from SymEngine, like equation solvers, already exist in ModelingToolkit(/Symbolics.jl), which is why it’s basically moved into the territory of “vs SymPy” instead. We still have a long road ahead of us to cover all that is required in symbolic computing, of course, but there are already plenty of areas where we are far ahead (like build_function, essentially a “lambdify” that supports outputting optimized, non-allocating functions on sparse matrices). And there are very good reasons to believe that this project is well-funded and well-staffed, which will be explained more at JuliaCon, so it’s pretty far along and taken very seriously.
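As a hedged illustration of the build_function-as-lambdify point (the symbolic expressions are my own example, and the exact return conventions have varied across ModelingToolkit/Symbolics versions):

```julia
using ModelingToolkit

@variables x y

# For array outputs, build_function returns code for an out-of-place
# and an in-place variant; eval the first into a callable function.
f_oop, f_iip = build_function([x^2 + y, x - y], [x, y])
f = eval(f_oop)
f([2.0, 1.0])   # [5.0, 1.0]
```

Unlike SymPy’s lambdify, the generated code is ordinary Julia, so it compiles to the same machine code a hand-written function would.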