I’ve been using Python and SymPy to build up a series of quite large symbolic equations, which I then evaluate numerically and minimize (using a combination of lambdify and scipy.optimize).
The script is very slow, and I was wondering whether the speed advantages of Julia carry over to the SymPy.jl wrapper. I don’t want to get stuck converting the model only to find it isn’t actually any faster. Will SymPy.jl give outputs any faster, and is there an equivalent of lambdify for it? I looked through the docs but couldn’t see anything specific on speed comparisons.
Apologies if this is an obvious question. Also, if anyone knows a more native symbolic maths module for Julia, please let me know.
Thanks for your time. I know there was a similar question in '18; I was hoping there might have been some changes since then.
ModelingToolkit’s OptimizationSystem is made for this, and last time I checked it was a whole lot faster than SymPy. Here’s a quick example of using it:
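(A minimal sketch, assuming a recent ModelingToolkit; the Rosenbrock-style loss and the name sys are just placeholders.)

using ModelingToolkit

@variables x y
@parameters a b
loss = (a - x)^2 + b * (y - x^2)^2
sys = OptimizationSystem(loss, [x, y], [a, b])   # newer versions may want @named sys = ...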
Calling generate_function on the system spits out an (x, p) function, but you can close over the parameters to use it with standard optimization packages.
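Continuing the sketch above (hedged: depending on the ModelingToolkit version, generate_function may return a single expression or an (out-of-place, in-place) pair):

fexpr = generate_function(sys)            # code for the objective in (u, p) form
f = eval(fexpr isa Tuple ? fexpr[1] : fexpr)
p = [1.0, 100.0]                          # values for the parameters a, b
obj(u) = f(u, p)                          # a plain u -> loss function for Optim.jl, NLopt.jl, ...
obj([0.0, 0.0])                           # evaluate at a test point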
SymPy.jl is just calling Python’s sympy package from Julia — it’s convenient if you are doing other calculations in Julia and want to combine them with some symbolic math, but the SymPy calls themselves will not be any faster.
However, if most of your time is spent not in SymPy itself but rather in evaluating your “lambdified” equations (i.e. Python functions) within the optimization code, then that can almost certainly be made much faster by taking the SymPy expressions and “lambdifying” them into corresponding Julia functions instead.
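As a sketch of what I mean (assuming SymPy.jl; the expression here is arbitrary):

using SymPy

x, y = symbols("x y")
ex = x^2 + sin(y)
f = lambdify(ex, [x, y])   # compiles the SymPy expression to an ordinary Julia function
f(1.0, 2.0)                # ≈ 1.909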
Even better is to do everything in Julia, either using ModelingToolkit as @ChrisRackauckas suggests, or something else. It’s hard to give more specific advice since you don’t give any details on your equations (e.g. why do you need SymPy for this?).
@ChrisRackauckas that looks good, I’ll have a play. Can I define matrix variables?
@stevengj thanks, yeah, I think a lot of the time might just be in the helper functions. But from what you say, I think the Julia version would be quicker. The reason I started in SymPy is that I’m building the equations iteratively from a load of biological assumptions (it’s an evolutionary game theory model), so I do a lot of looping over matrices and adding terms or substituting in expressions. Once I have the system, I take gradients and lambdify to solve for an equilibrium.
I will look into trying ModelingToolkit, or just using Julia-generated functions, which I think will be quicker.
You can define a matrix of variables (i.e. a Julia matrix of variables) and that will work, but that’s slightly different from variables as matrices which is something @shashi is currently looking into.
Indeed, JuMP is a pretty good choice if it’s just for optimization, though iteratively building equations isn’t its strong point. Still, @SanAlphaTau should check out JuMP first.
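For reference, a minimal JuMP sketch (assuming the Ipopt solver and the legacy @NLobjective syntax; any nonlinear solver would do):

using JuMP, Ipopt

model = Model(Ipopt.Optimizer)
@variable(model, x, start = 0.0)
@variable(model, y, start = 0.0)
@NLobjective(model, Min, (1 - x)^2 + 100 * (y - x^2)^2)
optimize!(model)
value(x), value(y)   # ≈ (1.0, 1.0) for this Rosenbrock objective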
To summarize, SymPy.jl gives you:

- a wrapper around Python’s sympy that produces algebraic expressions, and
- a convenient way to “lambdify” these expressions into efficient Julia functions, which can be used directly with optimization packages such as NLopt.jl (see the sketch below).

Whether this will be faster or easier than doing this purely in Python depends on the details of your problem.
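A sketch of that second point (assuming SymPy.jl plus NLopt.jl; the objective, solver, and starting point are arbitrary):

using SymPy, NLopt

x, y = symbols("x y")
ex = (1 - x)^2 + 100 * (y - x^2)^2
f = lambdify(ex, [x, y])               # SymPy expression -> native Julia function

opt = Opt(:LN_NELDERMEAD, 2)           # derivative-free local optimizer, 2 variables
xtol_rel!(opt, 1e-8)
min_objective!(opt, (u, grad) -> f(u[1], u[2]))
minf, minu, ret = optimize(opt, [0.0, 0.0])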
I can see this must be possible, since it appears to translate to and from Expr, but I don’t see an explicit example in the docs. Do you have a link? I think this is perhaps the most interesting feature of these kinds of packages.
Well, that is kind of an exercise for the user to try… since the input and output can be Julia expressions, the user can utilize the results in whatever Julia metaprogramming style they wish.
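For instance, a tiny round trip in plain Julia (the expression here is arbitrary):

ex = :(x^2 + 2x + 1)         # build an Expr programmatically
f = eval(:(x -> $ex))        # splice it into a lambda and compile it
Base.invokelatest(f, 3.0)    # 16.0; invokelatest avoids world-age issues inside functions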
One particular example is the SyntaxTree.genfun method, which can be used to generate a function for an expression. Wilkinson.jl is a mathematical investigation of IEEE roundoff error in polynomials which uses genfun to compare and analyze the evaluation of generated expressions. Another place this is used is in Fatou.jl where optimized code for Newton fractals is generated similarly.
SymPy.jl tries to provide a Julia-like interface to Python’s SymPy; both are intended to be used, for the most part, programmatically. Symata tries to follow Mathematica semantics; in particular, it is a separate language written in the host language Julia. However, I put a fair amount of effort into allowing a user to break this separation and interact with Julia; in particular, one can use Symata from Julia programs. In fact, that seems to be the way people prefer to use it.
Apologies for necroing an old thread; I just got back to the old paper that used this after sidelining it for a while.
@variables H[1:3, 1:3]
expr = H * rand(3, 3)
# build_function returns a pair of expressions for array outputs:
# myFun[1] is the out-of-place version, myFun[2] the in-place one
myFun = build_function(expr, [x for x in reshape(H, (9, 1))])
callFun = eval(myFun[1])
callFun(ones(3, 3))   # works here because ones(3,3) has 9 entries in column-major order
I’ve just started trying to build expressions, but I’m running into some issues when creating callable functions. The idea is to have myFun and then find a root that gives a zero matrix output. Passing matrix variables is a bit awkward: with one matrix, as above, it’s okay with the unpacking via a list comprehension, but with two it loses all elegance:
@variables H[1:3, 1:3]
@variables G[1:3, 1:2]
expr = H * rand(3, 3) * G
myFun = build_function(
    expr,
    vcat([x for x in reshape(H, (9, 1))], [x for x in reshape(G, (6, 1))])
)
callFun = eval(myFun[1])
callFun(ones(15, 1))   # 9 entries of H plus 6 of G, so 15 values in total
I think I must be doing something wrong here. Apologies if this is already in the docs somewhere; I looked through some examples but couldn’t find anything similar, or at least couldn’t see the parallels myself. Is there a way to cleanly pass matrix variables to build_function? I’m not very competent, but I could try implementing a matrix-variable type with some mapping code that does all the reshaping and passing for me, which looks like the best option so far.
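One possibly cleaner route (a sketch; vec flattens column-major, and depending on your ModelingToolkit/Symbolics version you may need Symbolics.scalarize on the arrays first):

@variables H[1:3, 1:3]
@variables G[1:3, 1:2]
expr = H * rand(3, 3) * G
myFun = build_function(expr, vcat(vec(H), vec(G)))   # all 15 scalar variables at once
callFun = eval(myFun[1])
callFun(ones(15))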