Putting nonlinear optimization in a real-time closed loop

Hi,
I’m trying to run an Energy Management Algorithm (a QCP) I built using InfiniteOpt.jl in a real-time closed loop.
The algorithm receives measurements over a WebSocket connection.
My main scripting function looks like this:

function oneStepOpt_test1(sets::modelSettings)
    try
        # Open WebSocket client and connect to CEM
        HTTP.WebSockets.open("ws://127.0.0.1:8700") do ws
            # Use a flag to indicate connection status
            @info "Connected to CEM"
            HTTP.WebSockets.send(ws, "EMA") # Perform initial handshake
            @info "Sent initial handshake request: EMA"
            # Wait for handshake confirmation
            response = HTTP.WebSockets.receive(ws)
            # check if handshake was successful
            if String(response) == "EMA_ACK"
                @info "Handshake successful: "
                connection_status = Ref(true)
            else
                @info "Initial handshake failed: "
                connection_status = Ref(false)
            end

            connection_status[] ? nothing : error("Connection to CEM failed")
            while connection_status[]
                # 1. Get the initial state from CEM
                start_data_update_process(ws, connection_status)

                # 2. Build and solve optimization problem
                start_optimization_process(ws, connection_status, sets)
                sleep(10)  # Adjust sleep duration as necessary
            end
        end
    catch e
        println("Error: ", e)
        println("Reconnecting in 10 seconds...")
        sleep(10)
    end
end
function start_optimization_process(ws::HTTP.WebSocket, connection_status::Ref{Bool}, sets::modelSettings)
    @async begin
        warmstart_dict = Dict()
        while connection_status[]
            try
                # Call the solvePolicies function with the updated data
                @info "Optimizing..."
                results = solvePolicies(KNITRO.Optimizer, sets, shared_data, warmstart_dict) # Build and solve the InfiniteModel()
                warmstart_dict = copy(results)
                sets.dTime .+= (sets.Δt * 3600.0)
                # 2.1 Extract setpoints from results
                setpoints = Dict(
                    "actions" => results["actions"][1],
                    # Predictions of exogenous variables ̃B
                    "Btilde" => results["Btilde"][1],
                    # Predictions of the state of charge ̃S
                    "predStates" => results["states"][1],
                )
                println("Setpoints: ", setpoints)
                # 3. Send setpoints to CEM
                HTTP.WebSockets.send(ws, JSON.json(setpoints))
                @info "Sent setpoints to CEM"
                # sleep(20) <== inside try
            catch inner_e
                @error "Error during optimization process: $inner_e"
            end
            sleep(30) # <== inside while
        end # close while
    end # close async
end

The idea is to run the two main functions, start_data_update_process and start_optimization_process, in parallel. However, the sleep() inside the optimization function seems to be ignored, causing start_optimization_process to run faster than the other process.

Not a software engineer here, just a researcher trying to put their algorithm into production. Any good reference, advice, or comment is more than welcome.

Try printing something after the call to sleep and you’ll notice that it does in fact sleep. The @async causes start_optimization_process to return immediately; what you are probably missing is that at some point you need to wait on the Task returned by the @async call, something like this:

opt_task = start_optimization_process(...)
...
wait(opt_task)
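To see this concretely, here is a self-contained toy version (hypothetical names, not your original functions): a function whose last expression is an @async block returns its Task immediately, and the caller only blocks once it calls wait:

```julia
function start_toy_loop(counter::Ref{Int})
    @async begin
        for _ in 1:3
            counter[] += 1
            sleep(0.1)   # suspends only this task
        end
    end                  # the Task is the function's return value
end

counter = Ref(0)
task = start_toy_loop(counter)   # returns immediately
println("right after the call: counter = ", counter[])  # the loop may not have run yet
wait(task)                       # block until the loop finishes
println("after wait: counter = ", counter[])            # 3
```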

Note, though, that @async does not cause things to run in parallel; for that, you instead want Threads.@spawn. Also note the warning:

help?> @async
  @async

  Wrap an expression in a Task and add it to the local machine's scheduler queue.

...

  │ Warning
  │
  │  It is strongly encouraged to favor Threads.@spawn over @async always even when no parallelism
  │  is required especially in publicly distributed libraries. This is because a use of @async
  │  disables the migration of the parent task across worker threads in the current implementation
  │  of Julia. Thus, seemingly innocent use of @async in a library function can have a large
  │  impact on the performance of very different parts of user applications.
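Putting the pieces together, here is a runnable sketch of the pattern (the loop bodies are hypothetical stand-ins for your start_data_update_process / start_optimization_process): two loops started with Threads.@spawn, stopped via a shared flag, and joined with wait:

```julia
# Two concurrent loops via Threads.@spawn, stopped by a shared flag and
# joined with wait(). Atomics avoid data races on the shared counters.

function data_update_loop(running, updates)
    while running[]
        Threads.atomic_add!(updates, 1)  # stand-in for reading a measurement
        sleep(0.1)                       # suspends only this task
    end
end

function optimization_loop(running, solves)
    while running[]
        Threads.atomic_add!(solves, 1)   # stand-in for solving the model
        sleep(0.3)                       # slower loop; doesn't block the other
    end
end

running = Threads.Atomic{Bool}(true)
updates = Threads.Atomic{Int}(0)
solves  = Threads.Atomic{Int}(0)

data_task = Threads.@spawn data_update_loop(running, updates)
opt_task  = Threads.@spawn optimization_loop(running, solves)

sleep(1.0)           # let both loops run for a while
running[] = false    # signal both loops to stop
wait(data_task)      # block until each task has actually finished
wait(opt_task)

println("updates = ", updates[], ", solves = ", solves[])
```

Because sleep yields to the scheduler, the two loops interleave even with a single thread; with multiple threads they can genuinely run in parallel.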