Generate functions inside a function from string


I noticed that a function generated by eval is always a global function, which makes the code awkward when I want to generate a function that is defined in a string. Are there other ways to do something like this while avoiding eval? Examples:

function foo()
    return eval(parse("x->x"))
end
foo()(1)
# ERROR: MethodError: no method matching (::##9#10)(::Int64)
# The applicable method may be too new:
# running in world age 21870, while current world is 21871.

function foo()
    return x->x
end
foo()(1)  # works fine

By the way, I noticed there is a question mentioning this problem, but there is no useful solution there.

Related discussion: World age error when parsing function from String

Do you have any workarounds?

No more than the two I described in the linked discussion.

Use eval to define foo not in the body of foo:

julia> @eval foo() = $(Meta.parse("x->x"))
foo (generic function with 1 method)

julia> foo()
#5 (generic function with 1 method)

julia> foo()(1)
1

Are you entirely certain you actually need to be defining functions by parsing strings? This can certainly be done, but it’s a design that can lead to fragile code, security vulnerabilities, and/or performance issues. Can you describe the actual problem you’re trying to solve?

1 Like

FWIW, there’s a hack that allows you to define functions from expressions at runtime, without world age issues:

julia> funs = []
0-element Array{Any,1}

julia> @generated make_fun(::Val{N}, x) where N = funs[N]
make_fun (generic function with 3 methods)

julia> evil(code) = (push!(funs, code); x->make_fun(Val{length(funs)}(), x))
evil (generic function with 1 method)

julia> evil(:(x+2))(10)
12

invokelatest is way cleaner though. And unless you’re doing stuff like evolutionary programming, you should consider using macros instead of parsing strings.
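To illustrate the macro alternative: if the expression is already known at parse time, a macro can build the function during normal compilation, so neither eval nor invokelatest is needed and there is no world-age issue. A minimal sketch (`@makefun` is a made-up name):

```julia
# Sketch of the macro approach: the argument name and body are spliced
# into an anonymous function at parse time, so it compiles like any
# hand-written function.
macro makefun(arg, body)
    # esc() keeps the user's variable names, so `arg` inside `body`
    # refers to the anonymous function's parameter
    :( $(esc(arg)) -> $(esc(body)) )
end

f = @makefun(x, x^2 + 1)
f(2)  # returns 5
```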

1 Like

invokelatest and other hacks are only necessary if you’re calling code from something that’s long-running and can’t get recompiled and called afresh later when you (re)define foo. For example, the REPL itself. If you define foo() using eval and then call foo() later it will work fine without any additional tricks just as I demonstrated above. For example:

function f1(code::String)
    @eval g1() = $(Meta.parse(code))
    g1() # calls the old definition of `g1` if any
end

julia> f1("1 + 2")
ERROR: MethodError: no method matching g1()
The applicable method may be too new: running in world age 27552, while current world is 27553.
Closest candidates are:
  g1() at REPL[1]:2 (method too new to be called from this world context.)
Stacktrace:
 [1] f1(::String) at ./REPL[1]:3
 [2] top-level scope at none:0

julia> f1("'x'")
3

julia> f1("3/5")
'x': ASCII/Unicode U+0078 (category Ll: Letter, lowercase)


function f2(code::String)
    @eval g2() = $(Meta.parse(code))
    Base.invokelatest(g2) # calls the new definition of `g2`
end

julia> f2("1 + 2")
3

julia> f2("'x'")
'x': ASCII/Unicode U+0078 (category Ll: Letter, lowercase)

julia> f2("3/5")
0.6

If you’re defining foo at the top of a script and then calling it later, you don’t need to do anything special.
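A minimal sketch of that script case (assuming the code string is available before the call):

```julia
# Each top-level statement in a script runs in a fresh world age,
# so a function eval'd by one statement is callable from the next
# without invokelatest.
code = "x -> x + 1"
h = eval(Meta.parse(code))  # defined by a top-level eval

h(41)  # called in a later top-level statement; returns 42
```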

1 Like

My solution to this problem is implemented in the SyntaxTree package (on the most recent master branch)

julia> using SyntaxTree

julia> SyntaxTree.genfun(:(x^2+1),[:x])
(::#1) (generic function with 1 method)

julia> ans(2)
5

This automatically handles the Base.invokelatest step for you, so all you need to do is specify an expression and a list of function arguments.
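For reference, a genfun-style helper could be sketched roughly as follows. This is only my guess at the general idea, not SyntaxTree's actual implementation, and `makefun` is a made-up name:

```julia
# Build an anonymous function from an expression and an argument list,
# then hide the world-age issue behind Base.invokelatest so the result
# can be called immediately, even from inside another function.
function makefun(expr, args)
    f = eval(Expr(:->, Expr(:tuple, args...), expr))  # runtime definition
    return (xs...) -> Base.invokelatest(f, xs...)     # safe to call at once
end

g = makefun(:(x^2 + 1), [:x])
g(2)  # returns 5
```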

Well, I am working with SymPy. I would like to build a symbolic matrix and then evaluate it numerically. However, I found that SymPy cannot do the calculation fast enough, e.g.

using SymPy

@vars x


@time subs(expr,x=>1.0)
@time f(1.0)
@time g(1.0)

# timing:
# 0.000365 seconds (80 allocations: 2.453 KiB)
# 0.000003 seconds (5 allocations: 176 bytes)
# 0.000016 seconds (9 allocations: 240 bytes)

I found eval to be the fastest. As I have to diagonalize rather large matrices many times (e.g. ten thousand 400×400 matrices), I want to improve the performance.
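One hedged way to get eval's speed without paying a dynamic-dispatch cost on every call is to build the function from the string once and pass it into a compiled loop, using Base.invokelatest only at the boundary. A sketch (the expression and the `sum_values` name are made up):

```julia
# Turn the string into a real compiled Julia function, once
expr_string = "x^2 + 2x + 1"
f = eval(Meta.parse("x -> " * expr_string))

# The hot loop takes f as an argument, so it specializes on f's type
# and the calls inside the loop are ordinary compiled dispatch
function sum_values(f, xs)
    s = 0.0
    for x in xs
        s += f(x)
    end
    return s
end

# invokelatest is needed only here, once per batch, not once per element
Base.invokelatest(sum_values, f, [1.0, 2.0])  # returns 13.0
```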

Besides, I thought building something from a string should not be a very uncommon thing…

Have you tried using Reduce.jl for symbolic calculation?

using SyntaxTree, Reduce

expr =,:x)
f = SyntaxTree.genfun(repr(expr),[:x])

Don’t know what that repr is, though.

Now that you brought it up, I have been looking at Reduce.jl but don’t quite get it.

How’s Reduce.jl compared to SymPy and Maxima, in terms of speed and features?

Reduce.jl was initially forked from Maxima.jl and modified for the Reduce CAS, but since then it has gained many more generalized features than Maxima.jl currently has, thanks to its parser generator. The premise behind Reduce is different from that of SymPy: in SymPy, new symbol objects are defined; in Reduce, the native Julia symbol and expression types are used and translated into Reduce commands.

I have not done a benchmark to compare the speed with SymPy, but it should be fairly quick (although not as quick as SymEngine yet). There are multiple optimizations and rewrites planned to increase the performance in various aspects of the package, however, it will require some tweaking and fine tuning.

My recommendation is to try it out and let me know your own thoughts about it.

Also, it seems your code only works for me without the repr, and the produced function f is type-unstable?

julia> using SyntaxTree, Reduce

julia> expr =,:x)

julia> f = SyntaxTree.genfun(expr,[:x])
(::#1) (generic function with 1 method)

julia> @code_warntype(f(1.0))

      SSAValue(0) = $(Expr(:foreigncall, :(:jl_toplevel_eval_in), Any, svec(Any, Any), :(SyntaxTree.SyntaxTree), 0, :((Core.getfield)(#self#, :gs)::Symbol), 0))
      return (Core._apply_latest)(SSAValue(0), (Core.tuple)(x::Float64)::Tuple{Float64})::Any

As I said in my post, I don’t have any information about the original author’s repr method.

Yes, you’re right about the type instability, might have to tweak the implementation, it’s a concept design.

Well, I only used repr to generate strings; if you already have an expression, I don’t think you have to use it. In my code, parsing turns the string back into an expression.

Thank you. I’ll try the package next time I do symbolic calculation.

This is really good: one allocation more and an almost comparable time.

using SymPy
using SyntaxTree

@vars x


@time f(1.0)
@time h(1.0)
#   0.000003 seconds (5 allocations: 176 bytes)
#   0.000005 seconds (6 allocations: 192 bytes)

Note that the input to SyntaxTree.genfun doesn’t need to be a function; it should only be the expression for the function body, so you might want to use parse("$(repr(expr))") instead, since genfun automatically turns it into a function using the arguments from the list.

Are there any consequences of using invokelatest?