Hi all,
I am currently working on code that fits a model to experimental coherent anti-Stokes Raman spectra; the model has about nine decision variables, depending on the case. Because the computations in this model are very expensive, I precompute the most expensive part, which fortunately only depends on one variable (temperature), into a spectral library and do the rest during the fitting algorithm.
The consequence of this approach is that temperature is now only available at discrete values: those that were used when generating the library. For the sake of this example, say one generates a library of spectra ranging from 300 K to 2400 K in 10 K increments.
In a former approach implemented in MATLAB, I used a mixed-integer genetic algorithm for the fitting, which worked very nicely and had runtimes on the order of 500–1000 ms per spectrum.
I now wanted to port this to Julia (for reasons…) and I am kind of stuck when trying to get the fit to work…
My first approach was to treat temperature as an integer and use Juniper+Ipopt, which is painfully slow… So I dropped the idea of handling temperature as an integer, implemented interpolation between spectra, and used Ipopt alone. However, with nearest-neighbor interpolation the objective is piecewise constant in temperature, so the derivative is zero between library entries, which introduces spurious local minima.
This only gets a little better with linear interpolation. Going to higher order is unfortunately not an option because of the increased computational cost.
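For reference, this is roughly how I do the linear interpolation between neighboring library entries. This is a simplified sketch: the names `T_MIN` and `ΔT`, and the one-column-per-grid-temperature layout, are made up for illustration.

```julia
const T_MIN = 300.0   # lower bound of the library grid, in K (assumed)
const ΔT    = 10.0    # library temperature increment, in K (assumed)

# `library` holds one precomputed spectrum per column, at temperatures
# T_MIN, T_MIN + ΔT, T_MIN + 2ΔT, … Linearly blend the two neighbors of T.
function interp_spectrum(library::AbstractMatrix, T::Real)
    x = (T - T_MIN) / ΔT                          # fractional grid index
    i = clamp(floor(Int, x), 0, size(library, 2) - 2)
    w = x - i                                     # weight of the upper neighbor
    return (1 - w) .* library[:, i + 1] .+ w .* library[:, i + 2]
end
```

Between two grid points this is exact linear blending, but the slope with respect to `T` is still piecewise constant, which is where the optimizer struggles.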
The question now is: is it possible to control the underlying AD (ForwardDiff) in a way that it computes the gradient across a larger interval? In this case, the interval defined by the library's temperature increment.
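In case it clarifies what I mean: conceptually I would like the temperature component of the gradient to be a central difference spanning one library increment, rather than the infinitesimal AD derivative, so that the slope "sees" across the flat region between entries. A minimal sketch of that idea (the name `grid_slope` and the fixed 10 K step are just for illustration; the other components could still come from ForwardDiff):

```julia
const ΔT = 10.0   # library temperature increment, in K (assumed)

# Slope of the objective with respect to temperature, taken as a central
# difference over one full library increment instead of an AD derivative.
# `f` is the objective as a function of T with the other variables held fixed.
grid_slope(f, T) = (f(T + ΔT) - f(T - ΔT)) / (2ΔT)
```

Whether such a hybrid gradient can be handed to Ipopt cleanly (or whether it would confuse its line search) is exactly what I am unsure about.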
Or is there any way to control the minimum change of a decision variable between two consecutive iterations? This would also help to avoid spending effort on optimizing another (continuous) decision variable to the eighth decimal place in the case of a noisy spectrum.
I have to say, I am very new to automatic differentiation and my background is not computational engineering, so maybe I am thinking in the wrong direction here.
Providing a minimal working example is unfortunately not trivial, because it would require uploading either the library (which is on the order of several GB) or the entire code, which is not really minimal, I guess…
I did try other approaches, such as Evolutionary.jl, which sort of works, but even in the absence of experimental noise it fails to converge to the exact solution in most cases.
I really appreciate any hint in the right direction!
Max