Evaluation in calling scope?

From the docs, eval() will “Evaluate an expression in the global scope of the containing module”. For example:

julia> ex = :(x + 1)
:(x + 1)

julia> f() = (x = 10; println(x + 1, " ", eval(ex) ); g() )
f (generic function with 1 method)

julia> g() = (x = 20; println(x + 1, " ", eval(ex) ); h() )
g (generic function with 1 method)

julia> h() = (x = 30; println(x + 1, " ", eval(ex) ) )
h (generic function with 1 method)

julia> x = 0
0

julia> f()
11 1
21 1
31 1

but I want to evaluate the expression in the calling scope. How could I achieve the following:

julia> f()
11 11
21 21
31 31

? thanks. :pray:

There are probably workarounds to achieve what you want, but the straightforward answer is simply that you cannot eval in the calling scope, since eval evaluates in global scope.
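A minimal illustration of this (my own sketch; `local_demo` is a hypothetical name, not from the thread): even with a local `x` in scope, eval resolves the variable in the module's global scope:

```julia
# Sketch: eval always resolves variables in the module's global scope.
ex = :(x + 1)
x = 0                               # global binding

local_demo() = (x = 10; eval(ex))   # the local x = 10 is invisible to eval

@assert local_demo() == 1           # uses the global x = 0, not the local x
```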


If at all possible, use a function instead. Otherwise see Passing a string into function as a expression.
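For illustration, here is a sketch of the original f/g/h chain rewritten to take a function argument instead of eval-ing an expression (my own sketch; the names `ftest`/`gtest`/`htest` are hypothetical):

```julia
# Sketch: pass a function down the call chain instead of eval-ing ex.
htest(fn) = (x = 30; println(x + 1, " ", fn(x)))
gtest(fn) = (x = 20; println(x + 1, " ", fn(x)); htest(fn))
ftest(fn) = (x = 10; println(x + 1, " ", fn(x)); gtest(fn))

ftest(x -> x + 1)   # prints "11 11", "21 21", "31 31"
```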


thanks @GunnarFarneback. Inspired by your suggestion and other searches on “local evaluation”, I came up with the following not-so-perfect solution, which only works for very simple expressions:

function expr2fun(ex)
    vars = getvars(ex)
    args = Expr(:tuple, vars...)
    return eval(:($args -> $ex) )
end

getvars(ex::Symbol) = [ex]
function getvars(ex::Expr)
    vars = Symbol[]
    if ex.head == :call
        for arg in ex.args[2:end]
            vars = [vars..., getvars(arg)... ]
        end
    end
    return unique(vars)
end
getvars(ex) = Symbol[]  # fallback for literals and other non-variable leaves

f() = (x = 10; y = 1; println(x * (y - x), " ", fun(x, y) ); g() )
g() = (x = 20; y = 2; println(x * (y - x), " ", fun(x, y) ); h() )
h() = (x = 30; y = 3; println(x * (y - x), " ", fun(x, y) ) )

x = 0; y = 0
ex = :(x * (y - x) )
fun = expr2fun(ex)

julia> methods(fun)
# 1 method for anonymous function "#3":
[1] (::var"#3#4")(x, y) in Main at REPL[1]:4

julia> f()
-90 -90
-360 -360
-810 -810

in essence, expr2fun(ex) turns the expression into a function, which is then called locally as fun(...) rather than via eval(ex).
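To double-check the variable-extraction step, here is a restated, self-contained copy of the helper with my own assertions (not from the original post): getvars walks nested :call expressions and collects the unique symbols.

```julia
# Restated getvars: collect the unique variable symbols in a call expression.
getvars(ex::Symbol) = [ex]
function getvars(ex::Expr)
    vars = Symbol[]
    if ex.head == :call
        for arg in ex.args[2:end]   # args[1] is the function name itself
            append!(vars, getvars(arg))
        end
    end
    return unique(vars)
end
getvars(ex) = Symbol[]   # fallback for literals and other leaves

@assert getvars(:(x + 1)) == [:x]
@assert getvars(:(x * (y - x))) == [:x, :y]
@assert getvars(42) == Symbol[]   # literals contain no variables
```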

Unfortunately, it has a huge performance penalty:

f1(x, y) = x * (y - x)
f2(x, y) = fun(x, y)

julia> @btime f1(10, 20);
  0.022 ns (0 allocations: 0 bytes)

julia> @btime f2(10, 20);
  20.995 ns (0 allocations: 0 bytes)

julia> @btime fun(10, 20);
  17.771 ns (0 allocations: 0 bytes)

julia> @code_typed f1(10, 20)
CodeInfo(
1 ─ %1 = Base.sub_int(y, x)::Int64
│   %2 = Base.mul_int(x, %1)::Int64
└──      return %2
) => Int64

julia> @code_typed fun(10, 20)
CodeInfo(
1 ─ %1 = Base.sub_int(y, x)::Int64
│   %2 = Base.mul_int(x, %1)::Int64
└──      return %2
) => Int64

I do not understand where the penalty comes from??? From @code_typed, f1 and fun should be the same… :dizzy_face:

I would recommend using append! there.

The difference is most likely related to inlining and/or constant propagation. However, a sub-nanosecond figure like 0.022 ns is not the time it takes to perform a computation. What you see is constant propagation in action: basically measurement noise for the time to return a precomputed value.


@GunnarFarneback, I modified it with inlining and your suggestions:

function expr2fun(ex)
    vars = getvars(ex)
    args = Expr(:tuple, vars...)
    # return eval(:($args -> $ex) )
    # is below the correct way to inline???
    return eval(:($Expr(:meta, :inline); $args -> $ex) )   
end

getvars(ex::Symbol) = [ex]
function getvars(ex::Expr)
    vars = Symbol[]
    if ex.head == :call
        for arg in ex.args[2:end]
            append!(vars, getvars(arg) )
        end
    end
    return unique(vars)
end
getvars(ex) = Symbol[]  # fallback for literals and other non-variable leaves

f1(x, y) = x * (y - x)
@inline f2(x, y) = fun(x, y)

julia> @btime f1(1, 2);
  0.022 ns (0 allocations: 0 bytes)

julia> @btime f2(1, 2);
  17.376 ns (0 allocations: 0 bytes)

julia> @btime fun(1, 2);
  16.996 ns (0 allocations: 0 bytes)

julia> @btime $f1($1, $2);
  0.022 ns (0 allocations: 0 bytes)

julia> @btime $f2($1, $2);
  20.605 ns (0 allocations: 0 bytes)

julia> @btime $fun($1, $2);
  0.021 ns (0 allocations: 0 bytes)

now, only f2 seems slow, actually:

julia> @code_warntype f2(1, 2)
Variables
  #self#::Core.Compiler.Const(f2, false)
  x::Int64
  y::Int64
Body::Any
1 ─      nothing
│   %2 = Main.fun(x, y)::Any
└──      return %2

julia> @code_warntype fun(1, 2)
Variables
  #self#::Core.Compiler.Const(var"#3#4"(), false)
  x::Int64
  y::Int64
Body::Int64
1 ─ %1 = (y - x)::Int64
│   %2 = (x * %1)::Int64
└──      return %2

why??? why does f2 have a type warning when fun itself has none??? :dizzy_face:

besides, how could I then measure the “correct” computation time? thanks. :pray:

You need to somehow stop the compiler from having so much information that it can precompute the results. You could for example try broadcasting the operations over vectors. Yes, it will add some overhead but maybe it also becomes a bit more similar to a real use case.

As a rule of thumb, if the reported time is a fraction of a clock cycle, assume that the computation has been constant folded.
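One common way to hide the inputs from the compiler (my own sketch, assuming BenchmarkTools is available) is to interpolate Ref cells and dereference them inside the benchmarked expression, so the values are runtime data rather than compile-time constants:

```julia
using BenchmarkTools

f1(x, y) = x * (y - x)

# Interpolating a Ref and dereferencing it inside the expression
# hides the values from constant propagation, so the actual
# computation is what gets timed.
@btime f1($(Ref(10))[], $(Ref(20))[])
```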


Why are you doing this? What problem are you trying to solve? Why can’t you use a higher-order function instead?


I’m trying to build a model that is composed of a lot of sub-models (something like a deep learning network). Ideally, a complicated expression for the whole model could be built by chaining the (simpler) expressions generated by the sub-models. The final expression would then be evaluated: this way I could leverage (possible) compiler optimizations, e.g. broadcast fusion, and potentially avoid a lot of temporary allocations.

could you be more specific? (I don’t understand…) For example, by giving an example? thanks.

Sounds like you could pass functions instead, since you can chain together functions as well. For example, this combines two single-variable functions f1(x) and f2(x) into a two-variable function with the combine(a, b) function:

combine(combiner, f1, f2) = (x,y) -> combiner(f1(x), f2(y))

combine(+, sqrt, exp) # returns a function (x,y) -> sqrt(x) + exp(y)

This is called a higher-order function in computer science, and is much more flexible (in any language) and much faster (in Julia) than working with raw expressions.
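A quick check of the snippet above (my own assertions):

```julia
combine(combiner, f1, f2) = (x, y) -> combiner(f1(x), f2(y))

g = combine(+, sqrt, exp)    # behaves like (x, y) -> sqrt(x) + exp(y)
@assert g(1.0, 0.0) == 2.0   # sqrt(1) + exp(0) == 1 + 1
@assert g(4.0, 0.0) == 3.0   # sqrt(4) + exp(0) == 2 + 1
```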

I feel like this should be a FAQ or a PSA post. 90% of the people who ask about evaluating expressions or strings in local scope should actually be using a higher-order function, in my experience.


thanks @stevengj. Yes, higher-order functions are concise and beautiful.

the problem is that using the higher-order function approach would generate a lot of temporaries, e.g.:

combine((a, b) -> a .+ b, c -> c .- 1, d -> d .- 3)

in this case, unless super-clever inlining is done (which I don’t know how to achieve), there will be temporary allocations for the results of f1 and f2.
on the other hand, the single expression:

(a, b) -> (a .- 1) .+ (b .- 3)

would be fused together and allocations could be avoided.

that said, in my understanding, a lot of clever compiler optimizations are available at the expression level, yet not across function barriers.
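A small sketch of the allocation difference (my own example; the names `unfused`/`fused` are hypothetical): composing broadcasts through separate functions materializes intermediate arrays, while a single fused broadcast does not.

```julia
a = rand(1000); b = rand(1000)

step1(c) = c .- 1                       # materializes a temporary array
step2(d) = d .- 3                       # materializes a temporary array
unfused(a, b) = step1(a) .+ step2(b)    # three arrays allocated in total

fused(a, b) = (a .- 1) .+ (b .- 3)      # one fused broadcast, one allocation

@assert unfused(a, b) == fused(a, b)    # same values, different allocation cost
```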

Even if you write the most inefficient code possible using higher order functions, it is very likely that you would be orders of magnitude faster than anything involving eval.

That said, if you are worried about fusing, why don’t you just combine the functions on atoms, and broadcast that? Eg

combine(combiner, f1, f2) = (x, y) -> ((x, y) -> combiner(f1(x), f2(y))).(x, y)
f = combine(+, sqrt, exp)
f(1:3, 4:6)

That said, I would just work with functions that operate on atoms and do the broadcasting in the last step outside. But I have to admit that I lost track of what the problem is.


it’s because const is missing, see here

so, here’s the solution for calling-scope evaluation, despite the fact that only relatively simple expressions can be handled:

function expr2fun(ex)
    vars = getvars(ex)
    return eval(:($Expr(:meta, :inline); $vars -> $ex) )   # is it the correct way to inline???
end

getvars(ex) = Expr(:tuple, _getvars(ex)...)
_getvars(ex::Symbol) = [ex]
function _getvars(ex::Expr)
    vars = Symbol[]
    if ex.head == :call
        for arg in ex.args[2:end]
            append!(vars, _getvars(arg) )
        end
    end
    return unique(vars)
end
_getvars(ex) = Symbol[]  # fallback

ex = :(x * (y - x) - (x * y - 2) )
const fun = expr2fun(ex)        # const is needed to avoid warntype !

@inline f1(x, y) = x * (y - x) - (x * y - 2)
@inline f2(x, y) = fun(x, y)    # calling-scope evaluation

julia> f1(1, 2) == f2(1, 2)
true
julia> @btime f1(1, 2);
  0.022 ns (0 allocations: 0 bytes)
julia> @btime f2(1, 2);
  0.022 ns (0 allocations: 0 bytes)
julia> @btime fun(1, 2);
  0.022 ns (0 allocations: 0 bytes)

hope it helps.


Have you thought about making expr2fun a macro? A macro would basically do eval at the calling scope.

No, that’s not what macros do. Macros can basically never do something you can’t do without them.


Why do you need to call eval in local scope though? You can easily do something like

expression = generate_expression_for_my_model(...)
run_my_own_optimizations!(expression)
@eval my_model(data) = $expression

function run_simulations(file)
    data=read_data(file)
    my_model(data)
end

and you can even call a function generating an expression during runtime, eval it to create a new function and then use Base.invokelatest to call that function.
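A self-contained sketch of that last workflow (my own; `run_model` is a hypothetical name): build an expression at runtime, eval it into a function, then call it through Base.invokelatest to get past the world-age restriction.

```julia
# Build a function from a runtime-generated expression and call it.
function run_model(ex, data)
    model = @eval data -> $ex          # eval into a fresh anonymous function
    return Base.invokelatest(model, data)  # needed: model is from a newer world
end

@assert run_model(:(data + 1), 41) == 42
```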


That was my original plan, but it failed. A macro takes its argument as an expression. When I pass an expression into a macro, it becomes an “expression of an expression” inside the macro. So when I interpolate it (with $) “locally”, it turns back into the original expression, but that expression cannot be interpolated again!

so far I have found that the only way to do calling-scope evaluation is to go through a function: put the variables inside the expression as function arguments, and then have the expression evaluated locally inside the function.


because the model objects are constructed inside a function, and the expression is for processing the fields of those objects. After all, I don’t think putting all objects in global scope is good practice.

in any case, I think the ability to do calling-scope evaluation would be a nice feature.

I’m not sure doing this way would be faster or not, but at least now I’m able to try and test it!

What I meant is that you can also do

ex(x) = :($x + 1)

function f(x)
    my_model = @eval (x) -> $(ex(x))
    println(x + 1, " ", Base.invokelatest(my_model, x))
end

@kristoffer.carlsson yes, thanks.

but I heard that Base.invokelatest() is slow. Could I do it like this instead:

function f(x)
    my_model = nothing
    my_model = @eval (x) -> $(ex(x))
    println(x + 1, " ", my_model(x))
end

?