So I’ve been trying out Python lately, but I learned about Julia and it seems pretty amazing, so I want to switch to it.

I set up Juno and it’s working fine, but I’ve been having trouble finding resources on the various symbolic math packages and their features and benefits. So far Reduce.jl seems interesting and very capable, but I’m struggling to find tutorials or help on how to use it, and the website isn’t much help. I’m trying to solve for antiderivatives and do linear algebra.

Others I’ve looked at:

Symata – apparently you can’t automatically solve equations with it? I’m somewhat confused about it, and there’s little online about it.

Use SymPy.jl. It is by far the most complete and best-documented symbolic math library, although it is a wrapper around the Python SymPy library, so it’s not the fastest. But in symbolic math it is rare that you are compute-bound.
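Since SymPy.jl is a thin wrapper over the Python SymPy library, here is a minimal sketch in Python SymPy of the two things asked about, antiderivatives and linear algebra (the SymPy.jl calls look nearly identical after `using SymPy`; the specific expressions are just illustrative):

```python
import sympy as sp

# Symbolic antiderivative of x^2 * cos(x) with respect to x
x = sp.symbols("x")
F = sp.integrate(x**2 * sp.cos(x), x)

# Sanity check: differentiating the result recovers the integrand
assert sp.simplify(sp.diff(F, x) - x**2 * sp.cos(x)) == 0

# Exact symbolic linear algebra: solve A v = b over the rationals
A = sp.Matrix([[1, 2], [3, 4]])
b = sp.Matrix([5, 6])
v = A.solve(b)  # exact solution, no floating-point roundoff
print(F, v)
```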

Several months ago, I compared many CAS (symbolic packages) for Julia, for example SymPy, Symata, SymEngine, and Reduce. Although SymPy doesn’t have integral transforms or everything that MATLAB or Wolfram Mathematica offers, it really works well.

That’s essentially how you train it. You fuzz a training set of symbolic equations that you symbolically differentiate, and then train a neural network to anti-differentiate the equations by learning the reverse mapping on this dataset. We have an MLH student developing this method over the summer, so hopefully we can get something going soon enough.
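The data-generation trick described above can be sketched in a few lines. This is a toy illustration, not the actual project code: sample random polynomials, differentiate them symbolically, and keep (derivative, antiderivative) pairs, so a model trained to map the first string to the second learns to anti-differentiate.

```python
import random

def random_poly(max_degree=4, coeff_range=5):
    """A random polynomial represented as {power: coefficient}."""
    return {k: random.randint(1, coeff_range)
            for k in range(max_degree + 1) if random.random() < 0.5}

def differentiate(poly):
    """d/dx: each term c*x^k becomes c*k*x^(k-1)."""
    return {k - 1: c * k for k, c in poly.items() if k > 0}

def render(poly):
    """Serialize to a string, the form a seq2seq model would consume."""
    return " + ".join(f"{c}*x^{k}" for k, c in sorted(poly.items())) or "0"

random.seed(0)
dataset = []
for _ in range(1000):
    p = random_poly()
    # Input: the derivative; target: the original (its antiderivative)
    dataset.append((render(differentiate(p)), render(p)))
```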

Rules are hard. But yes, we do plan to get those into SymbolicUtils.jl. @HarrisonGrodin actually had that idea quite a while ago. It’s just easier to get students hired to train a neural network.
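For readers unfamiliar with what “rules” means here, a minimal sketch of rule-based term rewriting (hypothetical toy code, not SymbolicUtils.jl’s actual API): expressions are nested tuples, a rule is a pattern with wildcards plus a builder for the replacement.

```python
WILD = "?"  # any symbol starting with '?' matches one subexpression

def match(pattern, expr, bindings=None):
    """Try to unify pattern against expr; return a bindings dict or None."""
    if bindings is None:
        bindings = {}
    if isinstance(pattern, str) and pattern.startswith(WILD):
        if pattern in bindings:  # same wildcard must match the same thing
            return bindings if bindings[pattern] == expr else None
        return {**bindings, pattern: expr}
    if isinstance(pattern, tuple) and isinstance(expr, tuple) \
            and len(pattern) == len(expr):
        for p, e in zip(pattern, expr):
            bindings = match(p, e, bindings)
            if bindings is None:
                return None
        return bindings
    return bindings if pattern == expr else None

def rewrite(expr, rules):
    """Apply the first matching rule at the root of expr, once."""
    for pattern, build in rules:
        b = match(pattern, expr)
        if b is not None:
            return build(b)
    return expr

# Example rule: x * 1 -> x
rules = [(("*", "?x", 1), lambda b: b["?x"])]
simplified = rewrite(("*", ("+", "a", "b"), 1), rules)  # ("+", "a", "b")
```

The hard part alluded to above is not this matching machinery but choosing and ordering the rules so that rewriting terminates and actually simplifies.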

@ChrisRackauckas I was thinking that the Rubi rules could be a good training set, since their underlying distribution has been hand-picked to be around the most general solutions for each class of integration problems. Rubi also gives intermediate steps that can be used to generate increasingly difficult problems.
Not sure you can say, but do you plan to do the seq2seq approach from the Facebook paper?