Source code of a function

Hello,

Is it possible to recover the source code of a function as a string?
For example,

f(x,y) = 2sin(x+y)+1

Is there a function call on f that can get me s="2sin(x+y)+1"?

Thanks!

2 Likes

If you do

@which f(x,y)

it will print the file and line number where the function was defined (or tell you it was defined in the REPL). Try, for example, @which sin(π).
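For instance, with the function from the original post, a REPL session might look like this (the exact output varies by Julia version and where the method was defined):

```julia
julia> f(x, y) = 2sin(x + y) + 1
f (generic function with 1 method)

julia> @which f(1.0, 2.0)
f(x, y) in Main at REPL[1]:1
```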

2 Likes

In the REPL you could also try

@less f(x,y)

which opens the pager at the file and line reported by @which

1 Like

For reference, the source expression is available using the following:

Base.uncompressed_ast(methods(f).ms[1]).code

But right now it is necessary to go to the original code to get the unparsed version, as suggested above. It’s on the roadmap for 1.0; see this issue and others cross-referenced there.
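On more recent Julia versions, code_lowered gives you the same lowered statements without reaching into method internals; note that this is the desugared IR, not the surface syntax you wrote:

```julia
f(x, y) = 2sin(x + y) + 1

# code_lowered returns a vector of CodeInfo objects, one per
# matching method; .code holds the lowered statements, which look
# quite different from the original source text.
ci = code_lowered(f, (Float64, Float64))[1]
statements = ci.code
```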

4 Likes

Take a look at Sugar.jl, specifically its macro_form() function. It first tries to find and parse the function's source, and only if that's unavailable falls back to reconstructing code from the AST. macro_form often fails for functions generated using @eval (since the source includes values interpolated from the context) and may produce somewhat different results for functions defined in the REPL, but in general I've found this function extremely helpful.

4 Likes

Thanks! That’s exactly what I was looking for.

In particular, these lines did it:

using Sugar
method = Sugar.get_method(f, types)
code, str = Sugar.get_source(method)

2 Likes

It took me some time to realize that types is supposed to be a tuple.
So, if you want to get the source of a method that takes a Vector{Float64}, you need to write:
method = Sugar.get_method(f, (Vector{Float64},))

1 Like

Well, the Gallium debugger seems to be capable of showing the source, whether it came from a file or was entered at the REPL. How does it do it?

And, more importantly, in 1.0, what syntax should I use to view the source of a method of function f?

regards,

/iaw

The rcnlee procedure with Sugar.jl works just fine…

1 Like

IIRC Gallium reparses the source.

I’m using Julia 1.0.2, and it looks like I cannot install the Sugar package now.


julia> using Sugar
ERROR: ArgumentError: Package Sugar not found in current path:

  • Run import Pkg; Pkg.add("Sugar") to install the Sugar package.

Stacktrace:
[1] require(::Module, ::Symbol) at ./loading.jl:823

julia> import Pkg

julia> Pkg.add("Sugar")
Updating registry at ~/.julia/registries/General
Updating git-repo https://github.com/JuliaRegistries/General.git
Resolving package versions…
ERROR: Unsatisfiable requirements detected for package Matcha [8d673c98]:
Matcha [8d673c98] log:
├─possible versions are: [0.0.1-0.0.2, 0.1.0-0.1.1] or uninstalled
├─restricted by compatibility requirements with Sugar [e18849f4] to versions: [0.0.1-0.0.2, 0.1.0-0.1.1]
│ └─Sugar [e18849f4] log:
│ ├─possible versions are: [0.0.1-0.0.3, 0.1.0, 0.2.0, 0.3.0-0.3.1, 0.4.0-0.4.5] or uninstalled
│ └─restricted to versions * by an explicit requirement, leaving only versions [0.0.1-0.0.3, 0.1.0, 0.2.0, 0.3.0-0.3.1, 0.4.0-0.4.5]
└─restricted by julia compatibility requirements to versions: uninstalled — no versions left

Take a look at CodeTracking.jl.
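A sketch of what that looks like, assuming CodeTracking's @code_string macro; for code defined at the REPL, Revise usually needs to be loaded as well for the lookup to succeed:

```julia
# Sketch using CodeTracking's @code_string; Revise improves lookup
# for definitions made at the REPL or edited files.
using CodeTracking, Revise

f(x, y) = 2sin(x + y) + 1

# Returns the method's source as a string, e.g. "f(x, y) = 2sin(x + y) + 1"
str = @code_string f(1.0, 2.0)
```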

5 Likes

Hello! Is Sugar.jl still supported nowadays?

1 Like

If we’re talking about this Sugar.jl, it was updated 3 months ago.

I know this is an old thread, but I was just looking for any discussion of this. Yes, it was updated in 2021, but the last release is from 2018, and the example from the README now fails:

julia> Sugar.sugared(controlflow_1, (Int, Int), code_lowered)
ERROR: type Expr has no field typ
Stacktrace:
  [1] getproperty
    @ ./Base.jl:38 [inlined]
  [2] similar_expr(x::Expr, args::Vector{Vector{Any}})
    @ Sugar ~/.julia/packages/Sugar/nhFic/src/ast_tools.jl:6
  [3] _normalize_ast(expr::Expr)
    @ Sugar ~/.julia/packages/Sugar/nhFic/src/ast_tools.jl:108
  [4] replace_or_drop(f::typeof(Sugar._normalize_ast), drop::Sugar.var"#5#6", ast::Expr, result::Vector{Any})
    @ Sugar ~/.julia/packages/Sugar/nhFic/src/ast_tools.jl:32
  [5] replace_or_drop(f::Function, drop::Function, ast::Vector{Any}, result::Vector{Any})
    @ Sugar ~/.julia/packages/Sugar/nhFic/src/ast_tools.jl:25
  [6] replace_or_drop
    @ ~/.julia/packages/Sugar/nhFic/src/ast_tools.jl:24 [inlined]
  [7] replace_expr
    @ ~/.julia/packages/Sugar/nhFic/src/ast_tools.jl:21 [inlined]
  [8] normalize_ast
    @ ~/.julia/packages/Sugar/nhFic/src/ast_tools.jl:113 [inlined]
  [9] sugared(f::Function, types::Tuple{DataType, DataType}, stage::Function)
    @ Sugar ~/.julia/packages/Sugar/nhFic/src/sugarcoating.jl:92
 [10] top-level scope
    @ REPL[110]:1

Recovering an Expr from CodeInfo can be really useful, makes me wonder what it would take to get it working again…

As far as I know, there’s no 1-to-1 mapping between Expr and CodeInfo. But you can try to:

  1. Generate new Expr from CodeInfo. Simple function calls should be pretty similar to the original code, but complex expressions, control flow, macros and other high-level constructs will be lost.
  2. Find the path to the source code from the line info embedded in CodeInfo. However, this has never been reliable for me, even with Sugar.
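A minimal sketch of option 1 is just splicing the lowered statements into a block expression; SSA values, goto nodes, and the like survive verbatim, so the result is only a rough approximation of the original code:

```julia
g(x) = x + 1

# Take the lowered statements of the matching method...
ci = code_lowered(g, (Int,))[1]

# ...and wrap them in a block Expr. SSAValue references and
# control-flow nodes are kept as-is, so this is far from the
# original surface syntax.
ex = Expr(:block, ci.code...)
```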

If you describe your use case, I may have additional tips on one or both of these approaches.

1 Like

The use case is probabilistic programming. In the current release of Soss.jl, a Model is a block of Statements, each of which is either an Assign (lhs = rhs) or a Sample (lhs ~ rhs). I want to make this more flexible and allow control flow, especially for things like more complex dependencies among entries of an array.

I have a good start on this, mostly using ideas in this post. The current challenge is the question of how to update a sampled value and make sure to maintain consistency of the dependency graph.

It’s very popular to do this using something like a nested dictionary data structure to represent the trace. But this has a lot of overhead, and I’d much rather have something that covers a narrower set of use cases than something that’s slow. I had been thinking the “trace” could just be made of local variables, so it’s all in code with no data structure at all. At each ~, we could add a @label and a @goto, and the tilde function could pass back information about where to jump next.

Then I started poking around some more in @BenLauwens 's ResumableFunctions.jl. I had played with this before, but he’s now gotten it to be really fast. There are some cool ideas in the implementation:
https://benlauwens.github.io/ResumableFunctions.jl/dev/internals/

The mutable struct + state machine approach is really slick, and it’s really clever to use the typed code to get the types for the slots. But for my purposes, the name shadowing could get in the way. Reusing a variable name leads to one entry in the struct taking on multiple roles. This could lead to problems when we start jumping around.

Luckily, I think @thautwarm 's JuliaVariables.jl can be made to help with this. I opened a new issue about that here:
https://github.com/JuliaStaging/JuliaVariables.jl/issues/28

So I think the process would look something like

  1. Use JuliaVariables to solve for scopes and make sure names are unique
  2. Still at the AST level, make transformations like those described here to get us closer to the lowered representation
  3. Add @labels and @gotos to make it easy to jump between samples

At this point, we probably need to replace each ~ with a concrete tilde function, depending on whether we want to call rand or logdensityof, or something else. Then

  1. Lower the code
  2. Use LoweredCodeUtils.jl to get the edges (Edges · LoweredCodeUtils)

Then, for any variable that’s updated, the edges tell us what else needs to be done. So maybe it’s making an abbreviated version of the code for each of these cases, or adding some jumps to cover each of these cases. Not really sure yet.

Then I don’t know the best way to “run” lowered code. It’s important to avoid world age issues, obviously. I’ve been very happy with GeneralizedGenerated, but if this lowered code is lifted back up to the AST level it will be much bigger and (IIUC) any closures will be gone. So maybe it would need RuntimeGeneratedFunctions.jl? Or maybe lowered code can be executed directly? I’m not sure yet about some of these things.

Ok, this response got pretty long. Maybe it’s a little too down-in-the-weeds, but if you have ideas or suggestions I’d enjoy a discussion :slight_smile:

A couple of thoughts:

  • In addition to LoweredCodeUtils, you might want to look at CodeInfoTools.jl, as well as other packages in the JuliaCompilerPlugins organization
  • It’s also not terribly hard to traverse CodeInfo objects directly, e.g. see the tracer code in Umlaut.jl which handles ~98% of Julia functions
  • It’s possible to recover loops from the lowered code and represent them as an operation with a subgraph with its own inputs and outputs, thus preserving the overall DAG structure and not dealing with variable renaming and the like. The code for loop recovery in Ghost.jl is pretty tough though, and I have a better idea how to do it in Umlaut.jl, but it is still in my veeery long todo list
  • To run the lowered code, I personally prefer to generate an Expr, compile it, and call it using Base.invokelatest(). IRTools has a built-in function to run its own IR, but IIRC it works more or less the same way
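The last bullet, as a minimal sketch (the name gen_f is just an illustration):

```julia
# Build an expression, compile it with eval, and call it through
# invokelatest to sidestep world-age issues when this happens at
# runtime inside another function.
ex = :(gen_f(x) = 2x + 1)
eval(ex)
result = Base.invokelatest(gen_f, 20)  # 41
```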
2 Likes

Thanks, that looks great!

This looks like an interpreter, is that right? The stuff I’m doing here will be in inner loops, so it’s important to not have any extra overhead. Umlaut looks cool BTW.

Thanks, this sounds like a great idea.

I generally try to avoid eval, mostly because it requires global scope. invokelatest also used to have some overhead, but that may not be the case any more.

It’s a tracer: Umlaut goes through the CodeInfo line by line, executes each statement to get its value, and records the statement and value to a DAG (the tape). The tape can then be transformed and recompiled back into a normal function, so interpretation only happens once. Moreover, executing each statement isn’t strictly required; you can just as well traverse the CodeInfo and generate an Expr on the fly. The piece of code I linked just shows the most common cases you’ll encounter when traversing the CodeInfo object; what to do with them is completely up to you.
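Traversing the statements directly looks roughly like this (resolution of SSAValue and SlotNumber arguments is omitted, which a real tracer would need):

```julia
g(x) = x * x + 1

ci = code_lowered(g, (Int,))[1]

# Walk the lowered statements one by one; a real tracer would also
# resolve SSAValue/SlotNumber references back to earlier results.
for (i, st) in enumerate(ci.code)
    println(i, ": ", st)
end
```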

invokelatest also used to have some overhead, but that may not be the case any more.

IIUC, invokelatest(f, args...) tells Julia not to link to a specific version of f() during compilation, but instead go to the method table and get the latest version during runtime. Basically, it’s just a dynamic dispatch, and if you call it only once for the top-level function, its cost is relatively small. Maybe code generation or pre-compilation can reduce it even further, but in my experience it’s pretty complicated, fragile and generally isn’t worth it.