How to generate a function from a string in Julia

My goal:
I want to write a function that generates functions from a user-input string without much of a performance hit. For example:

str = "x^2 + x^3"
function gen_func_from_str(str::String)
    function f(x)
         # something to make 
         # y = x^2 + x^3
    end
end

and the call gen_func_from_str(str) should return a function that actually does what str says.
I have tried using a macro to solve this, only to find that macros operate on the code itself. So if I define a macro, say @gen_func, calling @gen_func(str) won’t do it, as the macro only sees str as the Symbol :str.
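A minimal sketch illustrating this (the macro here only inspects its argument, just for demonstration):

macro gen_func(s)
    dump(s)          # show what the macro actually receives at expansion time
    return nothing
end

str = "x^2 + x^3"
@gen_func(str)       # prints "Symbol str" -- the string's contents are never visible to the macro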
Another option is to use eval(Meta.parse(str)), which seems to put the result in global scope, with a performance hit.
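A minimal sketch of that option (the names here are just for illustration); the result lives at module scope:

# build an anonymous function from the user string and evaluate it in global scope
gen_func_eval(str) = eval(Meta.parse("x -> " * str))

g = gen_func_eval("x^2 + x^3")
g(2.0)    # 12.0 at the top level; if g were created and called inside the same function,
          # Base.invokelatest(g, 2.0) would be needed because of world age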
My second thought is that maybe I should manipulate the code file directly? For example, define the function like this:

function gen_func_from_str(str)
    new_str = "f(x) = " * str
    # write new_str to a temporary file and include it
    write("tmp.jl", new_str)
    include("tmp.jl")
    return f
end

but this implementation seems to be quite strange…

Now I feel I am not heading in the right direction at all. Is there some straightforward way to implement gen_func_from_str?

Maybe just return an anonymous function instead?

function gen_func_from_str(str)
    anon_fn_str = "x->$str"
    return eval(Meta.parse(anon_fn_str))
end

# example
f = gen_func_from_str("x^2+2x")
f(2) # returns 8

You probably won’t get anything much more performant than this, as you will always need to parse the string and evaluate it to get a function. This way you also avoid having to name the function and pollute the namespace.

Or check out RuntimeGeneratedFunctions.jl.


Thank you @jmair and @oheil!
A little benchmark shows that both solutions offer almost the same performance as an anonymous function; the second one just avoids the use of eval. But the performance hit seems to be quite large compared to a standard Julia function. See the example:


using RuntimeGeneratedFunctions
RuntimeGeneratedFunctions.init(@__MODULE__)

function gen_func1(str)
    expression = Meta.parse("(x)->$str")
    f = @RuntimeGeneratedFunction(@__MODULE__, expression)
end

function gen_func2(str)
    anon_fn_str = "x->$str"
    return eval(Meta.parse(anon_fn_str))
end

func_baseline(x) = x^2 + 2x
func_anonymous = (x -> x^2 + 2x)
func_rt_gen = gen_func1("x^2+2x")
func_eval = gen_func2("x^2+2x")

using BenchmarkTools
@btime func_baseline(2) # 0.875 ns (0 allocations: 0 bytes)
@btime func_anonymous(2) # 14.097 ns (0 allocations: 0 bytes)
@btime func_eval(2) # 13.179 ns (0 allocations: 0 bytes)
@btime func_rt_gen(2) # 13.471 ns (0 allocations: 0 bytes)

And a quick, ugly implementation that directly manipulates a code file:

using Random
fn_name = randstring(5) # "Lyq1v"
str = "$fn_name(x) = x^2 + 2x"
open("/tmp/$fn_name.jl", "w") do io
    print(io, str)
end
include("/tmp/$fn_name.jl")
func = getfield(@__MODULE__, Symbol(fn_name))  # look up the newly defined function by name
@btime func(2) # 13.471 ns (0 allocations: 0 bytes)
# In case I know the name of the function
@btime Lyq1v(2) # 0.875 ns (0 allocations: 0 bytes)

Why is there such a big difference when func is just an alias for Lyq1v?

I think the problem is that func_anonymous by itself is just a non-constant global variable pointing to the anonymous function.
If the reference to the anonymous function is in a local scope (I think also when it is passed to another function), it is fast as well, and all of them are similar in performance.
I think it is more reasonable to interpolate global references to anonymous functions in benchmarks, since that likely better reflects how they will be used later on:

@btime func_baseline(2) #  1.128 ns (0 allocations: 0 bytes)
@btime $func_anonymous(2) # 1.160 ns (0 allocations: 0 bytes)
@btime $func_rt_gen(2) # 1.127 ns (0 allocations: 0 bytes)

Non-const globals are bad for performance. Try

const func_anonymous = (x -> x^2 + 2x)
const func_rt_gen = gen_func1("x^2+2x")
const func_eval = gen_func2("x^2+2x")

Thank you for the tips! I am new to Julia and didn’t know that global functions are also slow.

Functions that you define in global scope are not a problem.
Here the issue is more that you create an anonymous function (which is fast by itself), but you then bind it to a global, non-constant variable as an alias name for the function. Since this variable could hold a value of any type (you could reassign it to a string on the next line), the compiler cannot figure out its type at compile time, and the code is therefore slower.
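Besides declaring the binding const, another common fix is to pass the generated function into another function as an argument (a function barrier); the helper below is just a sketch:

# hypothetical helper: once the generated function arrives as the argument f,
# its concrete type is known inside, so the calls to it compile to fast code
apply_twice(f, x) = f(x) + f(2x)

bc = gen_func2("x^2 + 2x")   # bc is still a non-const global...
apply_twice(bc, 1.0)         # ...but only this outer call pays the dynamic-dispatch cost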

Why? Why not have the user pass a function instead? E.g. pass (x,y) -> x^2 + y^2 instead of "x^2 + y^2".

I am trying to create an FEA application in which users can define custom boundary conditions by entering text in a Qt application. Currently the limitation is that users don’t have a Julia REPL exposed, so the inputs are basically strings.

Yes, in that case Meta.parse and eval or similar (e.g. include_string, to provide a REPL-like interface) are the main options. It should be fine in a real context where you only pay the dynamic-dispatch cost once (e.g. when you call your matrix-assembly function), not in your inner loops (as happens when you call with @btime using non-constant globals); it’s easy for micro-benchmarks to be misleading here.
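For instance, a minimal sketch with include_string (the user_code value is just a stand-in for whatever the Qt dialog sends):

user_code = "bc(x, y) = x^2 + y^2"       # hypothetical text from the GUI input field
include_string(@__MODULE__, user_code)   # evaluate it in this module, like typing it at a REPL
bc(1.0, 2.0)                             # 5.0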

Thank you for the tips!