Best symbolic calculus/algebra package(s) for Julia?

So I’ve been trying out Python lately, but then I learned about Julia, and it seems pretty amazing, so I want to switch to it.

I set up Juno and it’s working fine, but I’ve been having trouble finding resources on the various symbolic math packages and their respective benefits and features. So far Reduce.jl seems interesting and very capable, but I’m struggling to find tutorials or help on how to use it. The website isn’t much help. I’m trying to solve for antiderivatives and do linear algebra.

Others I’ve looked at:

Symata - apparently you can’t automatically solve equations with it? I’m sort of confused about it, and there is little about it online

SymEngine

SymPy - how powerful is this?

6 Likes

Use SymPy.jl. It is by far the most complete and best-documented symbolic math library, although it is a wrapper of the SymPy Python library, so not the fastest one. But in symbolic math it is rare that you are compute-bound.
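For the two tasks in the question (antiderivatives and linear algebra), a minimal sketch might look like this; the exact printed output can differ across SymPy.jl versions:

```julia
using SymPy
using LinearAlgebra  # for det

@syms x

# antiderivative of x^2 * exp(x)
integrate(x^2 * exp(x), x)   # => (x^2 - 2*x + 2)*exp(x)

# symbolic linear algebra on a small matrix
A = [Sym(1) x; x 2]
det(A)                       # => 2 - x^2
```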

6 Likes

Hello dear bro :slight_smile: I use Reduce.jl (https://github.com/chakravala/Reduce.jl), a symbolic parser generator for Julia language expressions using the REDUCE algebra term rewriter.

Several months ago, I compared several CAS (symbolic packages) for Julia, for example SymPy, Symata, SymEngine, and Reduce. Although Reduce doesn’t have integral transforms or everything that MATLAB or Wolfram Mathematica offers, it really works well.

I recommend it :slight_smile:
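As a taste, here is a minimal sketch of calling it through the `rcall` interface from the README (the returned expressions may print slightly differently depending on your REDUCE version):

```julia
using Reduce

# antiderivative via REDUCE's `int` operator
rcall(:(int(x * sin(x), x)))    # => :(sin(x) - cos(x) * x)

# derivative via REDUCE's `df` operator
rcall(:(df(x^3 + log(x), x)))   # => :((3 * x^3 + 1) / x)
```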

2 Likes

ModelingToolkit does a lot, but sadly not antiderivatives yet. We are going to train a neural network to do that soon, though.

3 Likes

Is it something similar to this?
https://www.quantamagazine.org/symbolic-mathematics-finally-yields-to-neural-networks-20200520/

1 Like

Yes.

3 Likes

Wow, interesting. I’ll keep an eye out for that.

2 Likes

Thanks! What other packages do you use that work well with Reduce?

1 Like

Would such a system use automatic or symbolic differentiation to check the results?

That’s essentially how you train it. You fuzz a training set of symbolic expressions, symbolically differentiate them, and then train a neural network to anti-differentiate by learning the reverse mapping on this dataset. We have an MLH student developing this method over the summer, so hopefully we can get something going soon enough.
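As a toy illustration of the data-generation step (entirely hypothetical, just to show the shape of the idea; SymPy.jl stands in here for whatever symbolic engine actually gets used):

```julia
using SymPy

@syms x

# a small pool of building blocks to fuzz expressions from
atoms = [x, x^2, sin(x), cos(x), exp(x)]
ops   = [+, -, *]

rand_expr() = rand(ops)(rand(atoms), rand(atoms))

# each pair maps a derivative (network input) to its antiderivative (target)
dataset = map(1:1000) do _
    f = rand_expr()
    (input = diff(f, x), target = f)
end
```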

3 Likes

@ChrisRackauckas Have you looked at rule-based integration? The Rubi rules could be a good starting point. https://rulebasedintegration.org/

1 Like

Hello again :slight_smile: For example right now, I am writing a paper and I’m using these packages:

using Distributions
using LinearAlgebra
using Plots
using Printf
using Reduce
using Revise
using SparseArrays

Rules are hard. But yes, we do plan to get those into SymbolicUtils.jl. @HarrisonGrodin actually had that idea quite a while ago. It’s just easier to get students hired to train a neural network.
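For a flavor of what rule-based integration could look like on top of SymbolicUtils.jl’s `@rule` rewriters, a toy sketch (the `∫` operator here is a made-up placeholder, not an existing API, and a real Rubi-style table would be vastly larger):

```julia
using SymbolicUtils
using SymbolicUtils.Rewriters: Chain

@syms x ∫(f, v)   # ∫ is a hypothetical symbolic "integral" operator for this demo

# two toy integration rules in the spirit of Rubi's rule tables
rules = Chain([
    @rule(∫(cos(~y), ~y) => sin(~y)),
    @rule(∫((~y)^(~n), ~y) => (~y)^(~n + 1) / (~n + 1)),
])

rules(∫(cos(x), x))   # => sin(x)
rules(∫(x^3, x))      # => x^4 / 4
```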

2 Likes

@ChrisRackauckas I was thinking that the Rubi rules could be a good training set, since their underlying distribution has been hand-picked to cover the most general solutions for each class of integration problems. Rubi also gives intermediate steps that can be used to generate increasingly difficult problems.
Not sure if you can say, but do you plan to use the seq2seq approach from the Facebook paper?

3 Likes

Also, if you can, use a distributed blockchain in the cloud!
:roll_eyes:

2 Likes

Yeah, so it’s a good thing to have. We’ll see where it ends up. It’s fun, though.