Profiling crashes or becomes non-responsive - where to report?

I am trying to profile a package I am working on. I have code that runs successfully when called normally or with @time. When run under @profile, it runs partway and then seems to crash: the REPL becomes completely non-responsive, even to repeated Ctrl-C.

Below is the code; for me the issue occurs only on the last line. I am running Julia version 1.9.2 under WSL (Linux on Windows). (I apologize in advance: one of the packages is very slow to load and compile.)

A few questions

  1. Where is the best place to report this? “Profile” doesn’t seem to have its own github.
  2. Any advice about profiling in another way that might not crash?
  3. I suspect that compilation is the vast majority of the effort here:
     109.134678 seconds (106.40 M allocations: 6.561 GiB, 3.63% gc time, 99.25% compilation time: <1% of which was recompilation)
     Can @profile explain why this is happening, or is there a better tool for that?
using ParameterEstimation
using ModelingToolkit, DifferentialEquations
using Profile  # needed for the @profile call on the last line
solver = Tsit5()

@parameters a b
@variables t x1(t) x2(t) y1(t) y2(t)
D = Differential(t)
states = [x1, x2]
parameters = [a, b]

@named model = ODESystem([
                             D(x1) ~ -a * x2,
                             D(x2) ~ 1 / b * (x1),
                         ], t, states, parameters)
measured_quantities = [
    y1 ~ x1,
    y2 ~ x2,
]

ic = [1.0, 1.0]
p_true = [9.8, 1.3]
time_interval = [0.0, 2.0 * pi * sqrt(1.3 / 9.8)]
datasize = 9
data_sample = ParameterEstimation.sample_data(model, measured_quantities, time_interval,
                                              p_true, ic, datasize; solver = solver)
@time res = ParameterEstimation.estimate(model, measured_quantities, data_sample)  # works fine

@profile res = ParameterEstimation.estimate(model, measured_quantities, data_sample)  # the REPL hangs here

This forum is a good starting point :slight_smile:

Can you try to isolate the cause of the crash? For instance by solving a simpler ODE and adding stuff until it fails?
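For example, something like this (a rough sketch: the plain-ODE rewrite and the printing options are mine, not from your post):

using DifferentialEquations, Profile

# The same system as above, written as a plain ODE without ModelingToolkit
f!(du, u, p, t) = (du[1] = -p[1] * u[2]; du[2] = u[1] / p[2])
prob = ODEProblem(f!, [1.0, 1.0], (0.0, 2.0 * pi * sqrt(1.3 / 9.8)), [9.8, 1.3])

solve(prob, Tsit5())            # warm-up run, so compilation stays out of the profile
@profile solve(prob, Tsit5())   # if this survives, the plain solve is not the culprit
Profile.print(mincount = 10)    # restrict the report to the hot frames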

That’s because when you run @time, the function has never been executed before, and so the compilation is basically all you measure. If you execute that line a second time, you should see a much smaller number.
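For example, reusing your call from above:

@time res = ParameterEstimation.estimate(model, measured_quantities, data_sample)  # first call: mostly compilation
@time res = ParameterEstimation.estimate(model, measured_quantities, data_sample)  # second call: mostly runtime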

Hi, just to follow up on this. I believe there are a couple of separate things going on.

  1. @profile does cause some "performance degradation", which is known and to be expected. Perhaps a 10x or 20x slowdown is reasonable? I am not sure about that. In any case, slow commands can become very, very slow when I try to profile them (one mitigation is sketched after this list).

As an example, the following is taking many minutes on my machine (perhaps it has crashed, I am not sure):

julia> using Profile
julia> @profile using ParameterEstimation

  2. The terminal does not respond to Ctrl-C. I have seen some other discussion of this (example: cannot-catch-ctrl-c-gracefully). Honestly, this is not a big deal for me; the profiling problems are more concerning.
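One mitigation for point 1 that I may try, based on the Profile documentation (a rough sketch: my_slow_call is just a placeholder, and the default sampling delay is platform-dependent):

using Profile

# Bigger sample buffer, coarser sampling: one sample every 10 ms instead of ~1 ms
Profile.init(n = 10^7, delay = 0.01)

@profile my_slow_call()
Profile.print()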

Are you sure the performance degradation is caused by profiling, and not just by very slow precompilation? As a side note, it is rather strange to try to profile a using statement; maybe that's why Julia doesn't like it. You can use the @time_imports macro for this situation.
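For example, in a fresh session (@time_imports is exported from InteractiveUtils, which the REPL loads by default on recent Julia versions):

julia> @time_imports using ParameterEstimation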


Well, what I'm actually doing is more like @profile includet("random_script.jl"), and the script includes using statements. I guess maybe I shouldn't do that (i.e., I should be profiling single function calls)? Or rather, it is perhaps a low-priority issue with @profile, which I'd be happy to report and work around if it is of interest. (This issue generally forces one to close the REPL and re-open it.)

Yeah, most probably. Each time you include code, you’re replacing old code, so Julia has to parse it, eval it, and even recompile the replaced code. With Revise/includet the situation is presumably even worse. I’m guessing this is not at all what you want to measure.
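A sketch of a workflow that avoids measuring all of that (estimate_once here is a hypothetical stand-in for whatever your script ultimately calls):

using Profile
include("random_script.jl")   # load (or includet) the code once, outside of @profile
estimate_once()               # warm-up call, so compilation stays out of the profile
@profile estimate_once()      # profile only the warm function call
Profile.print()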


I think I'm wading into the "TTFX" (time to first X) problem. @time_imports is a useful tool; I would like to know how to use it, though, so I'll start a new thread, as that is unrelated to this topic. Thanks again.
