How do I solve or optimize a mixed problem?

I have the following problem:

  • a wind farm
  • uncontrolled input: time- and location-dependent wind speed
  • control input: time-dependent induction factor
  • setpoint: time-dependent power demand
  • output: total power

For simplicity, let’s assume that all turbines use the same control input, so the input and the output are scalar.

I want to use feed-forward control: the control signal is the time- and location-dependent wind speed multiplied by a time-dependent correction factor, which is a spline with 6 control points (sketched below).
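To make the parameterization concrete, here is a minimal sketch of what I mean. The horizon length, the uniform knot placement, and the direct scaling of the wind speed by the spline value are illustrative assumptions, not the real model:

```python
import numpy as np
from scipy.interpolate import CubicSpline

T = 600.0                               # simulation horizon [s] (assumed)
knots = np.linspace(0.0, T, 6)          # knot times of the 6 control points
c = np.ones(6)                          # control-point values = decision variables

correction = CubicSpline(knots, c)      # time-dependent correction factor s(t)

def induction_factor(t, wind_speed):
    """Feed-forward control: local wind speed scaled by the correction s(t)."""
    return correction(t) * wind_speed
```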

Each of these control-point values u_i correlates strongly with the power output y_i in a distinct time interval, so we can say approximately:

y_i \approx f(u_i)

We then try to minimize the error

e_i = r_i - y_i

where r(t) is the time-dependent demand and r_i is its average over the i-th time segment.
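In code, I compute the segment errors from one simulation run roughly like this, where `simulate`, `t_grid`, and `r` are placeholders for my black box, its time grid, and the sampled demand:

```python
import numpy as np

def segment_errors(c, t_grid, r, n_segments=6):
    """One black-box run, then the tracking error r - y averaged over each
    of the time segments associated with the 6 control points."""
    y = simulate(c)                     # noisy 2-3 s black-box run, power on t_grid
    e = r - y                           # pointwise tracking error e(t) = r(t) - y(t)
    chunks = np.array_split(np.arange(len(t_grid)), n_segments)
    return np.array([e[idx].mean() for idx in chunks])
```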

Running the simulation once is costly, about 2 to 3 seconds, and the output is noisy. If I minimize the integral of the error over the simulation time, I need 500 to 1000 black-box evaluations to find a solution (see the baseline sketch below). However, given the strong correlation between u_i and y_i, I believe a different algorithm could be faster.
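For reference, my current approach is essentially derivative-free minimization of a scalar cost; Nelder-Mead is just one example, and I use the squared error here so positive and negative errors cannot cancel:

```python
import numpy as np
from scipy.optimize import minimize

def objective(c):
    y = simulate(c)                     # one noisy 2-3 s black-box evaluation
    return float(np.mean((r - y) ** 2)) # mean squared tracking error over the run

res = minimize(objective, x0=np.ones(6), method="Nelder-Mead",
               options={"maxfev": 1000})
```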

Any suggestions?

Perhaps a stochastic root-finding method could be used?
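To make that concrete, here is the kind of Robbins-Monro iteration I am imagining, reusing `segment_errors` from above: each segment error e_i is treated as a noisy residual and driven to zero by updating the matching control point with a decaying gain. The gain schedule is a guess, and it relies on y_i increasing monotonically with u_i:

```python
import numpy as np

c = np.ones(6)                          # initial control-point values
a0, decay = 0.5, 0.6                    # gain a_k = a0 / k**decay (0.5 < decay <= 1)

for k in range(1, 41):                  # ~40 runs instead of 500-1000
    e = segment_errors(c, t_grid, r)    # one noisy run -> 6 segment residuals
    c = c + (a0 / k**decay) * e         # raise c_i where power is short, lower where high
```

Would something along these lines be sound for a noisy black box, and is there a better-established method that exploits this per-segment structure?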