Profiling code reveals that most time is spent calling 3 functions in task.jl

I profiled my call to a function and got the following plot:

Those first 3 long horizontal lines are calls to task_done_hook, then wait, then poptaskref (in ascending order). This happens even on the 2nd or 3rd call to my function, and I don’t understand what they could be. After a while I stopped annotating many of the types used / returned by my functions, since doing it properly was costing me time - could this be the cause? How could I find out?

Did you do Profile.clear() in between?

I stopped specifying a lot of the types used / returned in my functions after a while

This should not impact performance.
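To see why omitting annotations shouldn't matter, here is a minimal sketch (with hypothetical function names): Julia specializes a method on the concrete types of the arguments at each call site, so an unannotated function compiles to the same code as an annotated one.

```julia
# Hypothetical example: the same function with and without type
# annotations. Julia specializes both on the concrete argument types,
# so the compiled code is equivalent.
add_annotated(x::Int, y::Int)::Int = x + y
add_plain(x, y) = x + y

# Compare the lowered, inferred code for yourself:
# @code_typed add_annotated(1, 2)
# @code_typed add_plain(1, 2)
add_annotated(1, 2) == add_plain(1, 2)
```

Annotations mainly matter for dispatch and documentation; what does hurt performance is type *instability* inside a function body, which `@code_warntype` can reveal.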


You can also give C = true to ProfileView to see if there are C-frames hiding in the dark… eeh I mean in the light.
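As a sketch of that suggestion (using a hypothetical stand-in workload): `Profile.print(C = true)` includes C frames (libc, libpthread, the Julia runtime) in the text report, and ProfileView accepts the same flag.

```julia
using Profile

# Small stand-in workload (hypothetical; substitute your own function):
workload() = sum(sqrt(i) for i in 1:10^6)

Profile.clear()
@profile workload()

# C = true also shows C frames that are hidden by default:
Profile.print(C = true)
# The same flag works for the graphical view: ProfileView.view(C = true)
```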


I used @profview, which includes the profile clear command:
@profview f(args...) is just shorthand for Profile.clear(); @profile f(args...); ProfileView.view()
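Spelled out manually (with a hypothetical `workload` in place of `f(args...)`), that shorthand expands to:

```julia
using Profile, ProfileView

# Hypothetical stand-in for your own f(args...):
workload() = sum(sqrt(i) for i in 1:10^6)

# Manual equivalent of `@profview workload()`:
Profile.clear()       # discard samples from earlier runs
@profile workload()   # collect samples while workload runs
ProfileView.view()    # open the flame graph
```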

I did that, and now I see that the last function called (topmost line on the left-hand side) is `libpthread.so.0`, `pthread_cond_wait: line -1` - any help interpreting that?

EDIT: There’s also a function call before that which is marked in red (which is bad as I understand): `gf.c`, `jl_apply_generic: line 2191`

Are you running multiple julia threads? I found similar results when profiling anything if julia was launched with more than one thread. Profiling seems to work properly for me only when julia is launched with a single thread. If you’re using Juno, note that it launches julia with multiple threads by default. See the following, for example:

I also remember getting that big bar when running things with multiple threads a long time ago.

Huh… If that’s the case, I guess my code really is just quite slow then! Will report back

So I set ENV["JULIA_NUM_THREADS"] = 1 in the REPL, ran my script and I get the same problem. I added the line @show ENV["JULIA_NUM_THREADS"] in my script and it correctly shows that I’m using 1 thread.
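One caveat worth noting here: `JULIA_NUM_THREADS` is read once, at startup, so assigning to `ENV` inside a running session changes the environment variable but not the live thread count. The authoritative check is `Threads.nthreads()`. A minimal sketch:

```julia
# Changing the env var in a running session does NOT change the
# number of threads Julia was started with:
ENV["JULIA_NUM_THREADS"] = "1"
@show ENV["JULIA_NUM_THREADS"]   # shows "1", but that's just the env var

# The runtime value is what actually matters:
@show Threads.nthreads()
```

So `@show ENV["JULIA_NUM_THREADS"]` printing 1 does not guarantee the session is single-threaded; to be sure, start Julia itself with the variable set to 1.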

However, I also just wrote a bunch of print statements in my function to see how long they would take to appear, and it seems that my function really is just slow (it builds and solves an optimisation problem, so it takes about 5 seconds to run) :frowning: But at least now I know

I have the same “kind” of problem.

  1. Setting the thread number to 1 removes a spurious pink/salmon/pink part in the profile view.
  2. Once the thread number is set to 1, there is still a pink/red/pink “spurious” part that (IMHO) does not correspond to real time, because the measured time is proportional to my real iteration number.
  3. One thing that would be nice is a scale on the profile view axes (e.g. time in seconds for the x axis)
  • pink
    `/buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.2/REPL/src/REPL.jl, macro expansion: line 118`
  • red
    `./task.jl, #26: line 268`
    `/buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.2/REPL/src/REPL.jl, eval_user_input: line 86`
  • pink
    `./boot.jl, eval: line 330`

Profile with thread=1

Profile with thread=4

Note that the actual (`@time`) time is always the same (1.6 seconds).

OK, I have found that this spurious thing does appear when I use Juno. From the Julia REPL I have the following (correct) picture :wink:


I can replicate this problem with

  • julia 1.3, also julia 1.4 branch
  • versions
    (@v1) pkg> st --manifest ProfileView
    Status `~/.julia/environments/v1/Manifest.toml`
      [159f3aea] Cairo v1.0.2
      [5ae59095] Colors v0.11.2
      [5789e2e9] FileIO v1.2.1
      [a2bd30eb] Graphics v1.0.1
      [4c0ca9eb] Gtk v1.1.2
      [27996c0f] GtkReactive v1.0.2
      [8197267c] IntervalSets v0.3.2
      [c46f51b8] ProfileView v0.5.3
      [b77e0a4c] InteractiveUtils 
      [9abbd945] Profile 
      [cf7118a7] UUIDs 
    
  • environment variable JULIA_NUM_THREADS set to 4 or 8.
  • Linux

Setting the latter to 1 fixes things.

I am not sure where to report this, Julia or ProfileView?

Because Juno sets the number of Julia threads.

I think profiling threaded code just doesn’t work well right now.