How to solve or optimize a mixed problem?

I have the following problem:

  • a wind farm
  • uncontrolled input: time and location-dependent wind speed
  • control input: time-dependent induction factor
  • set values: time-dependent power demand
  • output: total power

For simplicity, let’s assume that all turbines use the same control input, so the input and the output are scalar.

I want to use feed-forward control: the control signal is the time- and location-dependent wind speed, multiplied by a time-dependent correction factor that is a spline with 6 control points.
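To make the structure concrete, here is a minimal Python sketch of that feed-forward law (a piecewise-linear interpolant stands in for the actual spline; `wind_speed`, `knots`, and `values` are hypothetical names, not FLORIDyn.jl API):

```python
def correction_factor(t, knots, values):
    """Time-dependent correction factor c(t): a piecewise-linear
    stand-in for the spline defined by its control points."""
    if t <= knots[0]:
        return values[0]
    if t >= knots[-1]:
        return values[-1]
    for k in range(len(knots) - 1):
        if knots[k] <= t <= knots[k + 1]:
            w = (t - knots[k]) / (knots[k + 1] - knots[k])
            return (1.0 - w) * values[k] + w * values[k + 1]

def control_signal(t, wind_speed, knots, values):
    """Feed-forward control: measured wind speed scaled by c(t)."""
    return correction_factor(t, knots, values) * wind_speed(t)
```

With 6 control points, `values` is the 6-vector of parameters the optimizer tunes.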

Each of these control point values correlates strongly with the power output in a distinct time interval, so we can write:

y_i \approx f(u_i)

And we try to minimize the error

e_i = r_i - y_i

where r(t) is the time-dependent demand and r_i the demand in the i-th time segment.

Running the simulation once is costly, around 2 to 3 seconds, and the output is noisy. If I minimize the integral of the error over the simulation time, I need 500 to 1000 black-box evaluations to find a solution. However, due to the strong correlation between u_i and y_i, I believe a different algorithm might be faster.
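In other words, the scalar cost handed to the black-box optimizer is the discretized integral of the squared error. A Python sketch (`simulate` is a hypothetical stand-in for one FLORIDyn run):

```python
def integral_error_cost(simulate, u, dt):
    """Discretized integral of the squared error: dt * sum_i e_i^2.

    `simulate(u)` runs one (noisy) simulation with spline parameters u
    and returns the per-segment errors e_i = r_i - y_i."""
    e = simulate(u)
    return dt * sum(ei * ei for ei in e)
```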

Any suggestions?

Perhaps a stochastic root-finding method could be used?
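For instance, since each e_i responds mainly to u_i, one could run a Robbins-Monro style iteration on all segments at once, with a decaying gain to average out the simulation noise. A rough Python sketch of the idea (`simulate` is again a hypothetical stand-in for one FLORIDyn run, and the gain `a` would need tuning):

```python
def stochastic_root_find(simulate, u0, n_iter=30, a=0.5):
    """Robbins-Monro iteration u_{k+1} = u_k + (a/k) * e(u_k).

    `simulate(u)` runs one (noisy) simulation and returns the vector of
    per-segment errors e_i = r_i - y_i. Assumes each e_i responds mainly
    to u_i with positive sensitivity (increasing u_i increases y_i)."""
    u = list(u0)
    for k in range(1, n_iter + 1):
        e = simulate(u)
        gain = a / k  # decaying step size averages out the noise
        u = [ui + gain * ei for ui, ei in zip(u, e)]
    return u
```

If something like this converges, it would need one simulation per iteration, i.e. tens of runs instead of the 500 to 1000 black-box evaluations, assuming the gain is matched to the sensitivity of y_i to u_i.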

Your post looks interesting, but it is a bit too generic (at least for me).
Do you have an explicit mathematical or physical model, or code you could provide?
Otherwise, could you describe the general setting in more detail:

  • ODE / SDE / DAE / Universal or Neural Differential Equation + noise(?)
  • small or large scale, e.g. 3 or 1000 equations (given your simulation time of "2 to 3 seconds", I assume the latter)
  • linear or nonlinear system

Which approaches (diff eq. solvers, optimizers) have you tried so far?
You mention the minimization of the integral error and black box evaluations. Do you mean something like
\min\limits_{u_{1}, \ldots, u_{n} \in \mathbb{R}^M} \sum_{i=1}^{n} \left\lVert e(t_{i}) \right\rVert^2
where u_{i} are the control input signals for each time point?

I mean, there are several methods in OrdinaryDiffEq.jl to simulate systems quickly or slowly (and more or less precisely); analogously, Optimization.jl offers both fast and slow optimizers.

The model is freely available, including multiple papers and a PhD thesis describing the details: Home · FLORIDyn.jl

This is not an ODE or anything similar, but rather a complex aerodynamic model with a 4D result set (including time and a 3D wind field), somewhat optimized for speed.

I have already found and fixed one issue: I was optimizing the relative power instead of the absolute power.

This is my work-in-progress test case: Add test case wind puls by ufechner7 · Pull Request #99 · ufechner7/FLORIDyn.jl · GitHub

And u_i are not the control signals at each time point, but the parameters of a spline describing the control signal over time. I have about 1000 time points, but the number of spline parameters is more like 10.

I use the NOMAD.jl blackbox optimizer.