I have the following expression:
ex = :(sum(x[i] for i=1:4) - y[1] * y[2] + z)
which I would like to evaluate over some data x, y, z. I have tried doing direct substitution through symbolic manipulations, but it is abysmally slow (~0.2s per set of substitutions).
It seems the right approach is to turn it into a function, as described here, but that approach seems somewhat limited in its applicability.
Is there a method for converting ex into a function like the following?
f(x,y,z) = sum(x[i] for i=1:4) - y[1] * y[2] + z
Thanks in advance!
The standard questions apply here:
- How did you end up with these expressions?
- Are they based on user input?
- Can you handle the problem you’re trying to solve using macros? (See the sketch after this list.)
- If you can only get access to the expressions at runtime in a way that rules out macros, you’ll likely need to use eval, but that is generally more challenging to do well and harder to make performant.
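For illustration, here is a minimal sketch of the macro route, assuming the expression is available literally at parse time (the name @make_xyz_function is hypothetical, not from any package):
# Hypothetical macro: splices the literal expression into an anonymous
# function at parse time, so no runtime eval is involved at all.
macro make_xyz_function(ex)
    return esc(:((x, y, z) -> $ex))
end

f = @make_xyz_function(sum(x[i] for i = 1:4) - y[1] * y[2] + z)
f(ones(4), ones(2), 5)  # 8.0
This only helps when the expression is written directly in source code; it does not apply if the expressions are only known at runtime.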
Is that used to parse symbolic expressions into functions, i.e., to allow the output of ModelingToolkit to be used in Plots, or possibly another CAS?
I had a similar problem with multivariate polynomials. You can take a look at the package
https://github.com/hofmannmartin/SphericalHarmonicExpansions.jl
There is a fastfunc macro and a fastfunc function, which can turn any multivariate polynomial into a fast-evaluating function. An example of its usage can be found here.
I hope this helps.
Thanks @johnmyleswhite, here’s the best answer I have:
- I end up with the expressions as representations of constraints in optimization problems. They can be user-generated or read directly from a file, but in general there are too many of them to transcribe into functions accurately and within a reasonable time.
- I’m not sure exactly how macros would help here, but my knowledge of them is limited; if you could elaborate, that would be appreciated.
- It is clear to me that eval is not an option, since performance is a significant issue.
@ChrisRackauckas thanks for the package link. I tried the following, but it throws an error, likely because of the embedded sum:
ex = :((x, y, z) -> sum(x[i] for i=1:4)- y[1] * y[2] + z)
using RuntimeGeneratedFunctions
RuntimeGeneratedFunctions.init(@__MODULE__)
f = @RuntimeGeneratedFunction(ex)
f(ones(4), ones(2),5) #ERROR: The function body AST defined by this @generated function is not pure.
#This likely means it contains a closure or comprehension.
Furthermore, it’s not clear to me how to actually specify the arguments given just the right-hand side of the expression. I guess I have a lot to figure out about what is going on under the hood.
It’s the generator expression x[i] for i=1:4. Does GeneralizedGenerated.jl work here, or does it have the same issue? I think it does have the same issue. I would just avoid the generator in the generated code.
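For concreteness, a minimal sketch of that, simply replacing the generator with a sum over an explicit slice (fex2 and f2 are just illustrative names):
using RuntimeGeneratedFunctions
RuntimeGeneratedFunctions.init(@__MODULE__)

# Same function as before, but the body contains no closure or generator,
# so the @generated machinery behind RuntimeGeneratedFunctions accepts it.
fex2 = :((x, y, z) -> sum(x[1:4]) - y[1] * y[2] + z)
f2 = @RuntimeGeneratedFunction(fex2)
f2(ones(4), ones(2), 5)  # 8.0
As for attaching the argument list when you only have the right-hand side, one option is to build the anonymous-function expression yourself, e.g. Expr(:->, Expr(:tuple, :x, :y, :z), rhs) for a right-hand-side expression rhs.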
GeneralizedGenerated.jl doesn’t work either… it seems like the only robust way to get around these issues is to go towards generating functions from the beginning!
Or use eval. eval won’t have overhead, but you just have to watch out for world-age issues.
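A minimal top-level sketch of that, reusing the anonymous-function form of the expression from above:
# At the top level, eval-then-call works, because each top-level statement runs
# in the latest world; world-age issues only bite when you eval and call inside
# the same function.
fex = :((x, y, z) -> sum(x[i] for i = 1:4) - y[1] * y[2] + z)
f = eval(fex)
f(ones(4), ones(2), 5)  # 8.0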
Hi, I would like to find out whether you eventually found a solution, as I am currently having the same problem and cannot seem to solve it.
If you are on 1.7, you could instead write this as:
using Base.Experimental: @opaque
ex = :((x, y, z) -> sum(Base.Generator(@opaque(i -> x[i]), 1:4)) - y[1] * y[2] + z)
and then use RuntimeGeneratedFunctions on ex as described above.
Following the directions, I ran:
f(ones(4), ones(2),5)
#ERROR: The function body AST defined by this @generated function is not pure.
8.0
Is the error spurious, or is there a fix?
Can you try this in a fresh session? I don’t get any errors:
julia> using Base.Experimental: @opaque
julia> ex = :((x, y, z) -> sum(Base.Generator(@opaque(i -> x[i]), 1:4)) - y[1] * y[2] + z)
:((x, y, z)->begin
        #= REPL[2]:1 =#
        (sum(Base.Generator(#= REPL[2]:1 =# @opaque((i->begin
                        #= REPL[2]:1 =#
                        x[i]
                    end)), 1:4)) - y[1] * y[2]) + z
    end)
julia> using RuntimeGeneratedFunctions
julia> RuntimeGeneratedFunctions.init(@__MODULE__)
julia> f = @RuntimeGeneratedFunction(ex)
RuntimeGeneratedFunction(#=in Main=#, #=using Main=#, :((x, y, z)->begin
        #= REPL[2]:1 =#
        (sum(Base.Generator(#= REPL[2]:1 =# @opaque((i->begin
                        #= REPL[2]:1 =#
                        x[i]
                    end)), 1:4)) - y[1] * y[2]) + z
    end))
julia> f(ones(4), ones(2),5)
8.0
Does it work on 1.6, since 1.7 does not seem to be available?
No, you will need 1.7. You can download the nightly builds here, but be warned that it is still in active development, so you might encounter some bugs.
Thank you. A clean REPL worked without error.
Could someone please clarify what the problems with the eval approach are?
I had a similar problem recently, which I solved with eval:
using InteractiveUtils

function give_xyz_header_eval(expr)
    fexpr = quote
        f(x, y, z) = $expr
    end
    eval(fexpr)
end

# tests
evaluated_f = give_xyz_header_eval(:(x^2 + y^2 + z^2 - 2x*y))
@code_warntype evaluated_f(1, 2, 3)  # looks fine

evaluated_fsum = give_xyz_header_eval(:(sum(x[i] for i = 1:4) - y[1] * y[2] + z))
@code_warntype evaluated_fsum(ones(4), ones(2), 3)  # looks fine
eval is fine for use at the top level. However, if you define a function with eval inside another function and then try to call it from that same enclosing function, e.g.:
function foo(a)
    evaluated_f = give_xyz_header_eval(:(x^2 + y^2 + z^2 - 2x*y))
    return evaluated_f(a, a, a)
end
you will run into world-age errors when trying to run foo.
A workaround to this is to invoke evaluated_f with Base.invokelatest(evaluated_f, a, a, a), but this has significant overhead, and the result of evaluated_f will always be inferred as Any, so foo can never be type stable.
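A sketch of that workaround, reusing give_xyz_header_eval from above (foo_latest is just an illustrative name):
function foo_latest(a)
    evaluated_f = give_xyz_header_eval(:(x^2 + y^2 + z^2 - 2x*y))
    # invokelatest bypasses the world-age check, but costs a dynamic dispatch
    # and the return value is inferred as Any.
    return Base.invokelatest(evaluated_f, a, a, a)
end

foo_latest(2)  # 4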
@RuntimeGeneratedFunction works around this by using some hacks involving generated functions to achieve the same thing without the overhead and without giving up on type inference.
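For comparison, a sketch of the same pattern with RuntimeGeneratedFunctions (foo_rgf and g are just illustrative names):
using RuntimeGeneratedFunctions
RuntimeGeneratedFunctions.init(@__MODULE__)

function foo_rgf(a)
    ex = :((x, y, z) -> x^2 + y^2 + z^2 - 2x*y)
    g = @RuntimeGeneratedFunction(ex)
    # Callable immediately in the same function: no world-age error and no
    # invokelatest needed.
    return g(a, a, a)
end

foo_rgf(2)  # 4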