Thank you so much for the timely responses. Just to provide some context, I’m using Python to do some consulting work for a client, which is why I don’t share my code unmodified. Just for my own interest, I decided to recreate this code in Julia. Julia appeals to me because I have a lot of experience in MATLAB, which I no longer use (too expensive now that I don’t have a company buying it).
I found the documentation for Interpolations.jl too terse. For example, your explanation with the code:
itp2 = interpolate((x,), y, Gridded(Linear()))
was helpful, and I didn’t find it in the docs. I gather the (x,) wraps the x vector in a one-element tuple, since Gridded expects one knot vector per dimension, rather than transposing anything.
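For anyone else who finds this, here is a minimal sketch of the 1D gridded setup (the sine data is just a stand-in, not my real data):

```julia
using Interpolations

# Knots and values; Gridded expects the knots as a tuple,
# one vector per dimension, so a 1-D fit takes (x,) rather than x.
x = collect(0.0:0.1:10.0)
y = sin.(x)

itp = interpolate((x,), y, Gridded(Linear()))
itp(2.55)   # linear interpolation between the knots at 2.5 and 2.6
```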
That said, I sort of need the extrapolation since the model for the curve fitting looks like this:
pval(x,p) = p[1]*fitA(x.+p[3]).+p[2]*fitB(x.+p[4]);
Each fit function models experimentally obtained dye responses (fluorescence lifetimes). The actual data I’m using is much higher resolution than the example I posted, and it varies in both magnitude and timing since it is real data. Because p[3] and p[4] shift the fits left or right, evaluation can land outside the measured range, so I get bounds errors without extrapolation.
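Here is a sketch of how I wrapped the interpolants so the shifted evaluations don’t throw (tA and dyeA are placeholder names standing in for one of my measured responses):

```julia
using Interpolations

# Placeholder data standing in for a measured dye response.
tA   = collect(0.0:0.05:20.0)
dyeA = exp.(-tA ./ 3.0)

# Wrap the gridded interpolant so evaluation outside
# [first(tA), last(tA)] extrapolates linearly instead of
# throwing a BoundsError.
fitA = extrapolate(interpolate((tA,), dyeA, Gridded(Linear())), Line())

fitA(-0.5)   # now legal: linear extrapolation past the left edge
```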
What is strange is that the Python code runs in about 40 msec and the equivalent Julia code runs in 500 msec. The problem is either in the interpolation functions, the model, or the curve-fitting function. Here are the two curve-fitting calls:
Python (leastsq is from scipy.optimize):
fit = leastsq(residuals, p0, args=(y, x), maxfev=2000)
Julia (curve_fit is from LsqFit):
fit = curve_fit(pval, x, y, p0)
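For reference, here is an end-to-end version of the Julia call with synthetic data standing in for my measured responses (the toy model just mimics the shape of pval):

```julia
using LsqFit

# Toy model with the same shape as my pval: amplitudes p[1], p[2]
# and time shifts p[3], p[4].
model(x, p) = p[1] .* exp.(-(x .+ p[3])) .+ p[2] .* exp.(-2 .* (x .+ p[4]))

x  = collect(0.0:0.1:10.0)
y  = model(x, [1.0, 0.5, 0.1, -0.2]) .+ 0.01 .* randn(length(x))
p0 = [1.0, 1.0, 0.0, 0.0]

fit = curve_fit(model, x, y, p0)
fit.param   # fitted parameters
```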
Both functions claim to use Levenberg-Marquardt. It is possible that the Optim package has a better solver and I can try that, but to start, I was trying to write comparable code.
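If I do try Optim, the plan would be to minimize the sum of squared residuals directly. A sketch with the same toy model (note BFGS is a quasi-Newton method, not Levenberg-Marquardt, which lives in LsqFit):

```julia
using Optim

# Same toy problem as above; minimize the sum of squared residuals
# over the parameter vector.
model(x, p) = p[1] .* exp.(-(x .+ p[3])) .+ p[2] .* exp.(-2 .* (x .+ p[4]))
x = collect(0.0:0.1:10.0)
y = model(x, [1.0, 0.5, 0.1, -0.2])

sse(p) = sum(abs2, y .- model(x, p))
res = optimize(sse, [1.0, 1.0, 0.0, 0.0], BFGS())
Optim.minimizer(res)
```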
I’m finding profiling a bit difficult. In MATLAB the profiler ranks code by the total time it takes to execute, which makes it easy to decide where to spend optimization effort. The Profile package in Julia lists which code is sampled the most, but not how long it takes.
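What I have so far, in case I’m holding it wrong: since the sampler fires at a fixed interval, the sample counts should at least be proportional to time, and the flat format can be sorted by count:

```julia
using Profile, LsqFit

# model, x, y, p0 as in the sketch above.
model(x, p) = p[1] .* exp.(-(x .+ p[3])) .+ p[2] .* exp.(-2 .* (x .+ p[4]))
x  = collect(0.0:0.1:10.0)
y  = model(x, [1.0, 0.5, 0.1, -0.2]) .+ 0.01 .* randn(length(x))
p0 = [1.0, 1.0, 0.0, 0.0]

curve_fit(model, x, y, p0)          # run once to compile
Profile.clear()
@profile for _ in 1:100
    curve_fit(model, x, y, p0)
end

# Flat listing sorted by sample count; counts are proportional to
# wall-clock time because samples are taken at a fixed interval.
Profile.print(format = :flat, sortedby = :count)
```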