Synchronous requests need to be performed one at a time on a remote worker

We want to prevent race conditions on remote workers serving requests from HTTP.jl on process 1. We are trying to stack the “requests” in a channel, and are trying to figure out how to do this with remotecall_fetch and Channels.

We need process 1 to wait for the results, but the worker can only handle one request at a time. The issue so far is that put! waits for the channel to take the request, but does not wait for the results.

Many, many thanks! Chuck

Process 1 - API Gateway

function handle_qreq(qreq)
    # need this to be synchronous and wait for the response
    (res, scen_cubes) = remotecall_fetch(query_cubes_task, 2, qreq)
    return res, scen_cubes
end

Worker 2 - Query Service

const query_channel = Channel(1)  # channel of size 1 so this worker only does one thing at a time, sequentially

function start_query_listening()
    try
        while true
            qreq = take!(query_channel)           # forever take! work off the channel
            res, scen_cubes = query_cubes(qreq)   # results are computed here but never make it back to process 1
        end
    catch ex
        println(stacktrace())
        println(stacktrace(catch_backtrace()))
    end
end

function query_cubes_task(qry_req::QReq)
    # put! does not return the data to the caller on process 1;
    # it only waits for the channel to accept the request, i.e. for work to START, not to finish
    put!(query_channel, qry_req)
end

How can we serve synchronous requests that get dispatched to worker processes from HTTP.jl? The same worker may get two requests simultaneously. Each worker needs a “queue” of work, and the main caller needs to wait for the response. Maybe channels are not the answer.

I’m not quite sure what you are trying to do. Are you saying that the remote processes receive the HTTP requests, and a single process handles those requests and pushes the responses back out to the remote?

My inclination would be to use multiple RemoteChannels. When a remote process receives a request, it packages the request up on the “main” RemoteChannel along with a newly created “response” RemoteChannel. The process then waits for a response on that newly created RemoteChannel before finishing the request.

The central processor instance reads from the main RemoteChannel to get the operation to perform; once it is done, it sends the result back on the response channel provided by the remote client for that request. This way everything is serialized through the main RemoteChannel, and the remote processes can wait on the created “response” channel before completing the request to the client.
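Roughly like this, as an untested sketch mapped onto your names (query_cubes, a query service running on worker 2, and the remote_do/@everywhere wiring are assumptions on my part):

using Distributed

# One shared request channel; each request carries its own response channel.
const main_chan = RemoteChannel(() -> Channel{Any}(32))

# Query-service loop, started once on worker 2, e.g. remote_do(query_service, 2, main_chan).
# It takes one request at a time, so the actual work stays serialized.
function query_service(reqs::RemoteChannel)
    while true
        qreq, resp_chan = take!(reqs)
        res, scen_cubes = query_cubes(qreq)        # your existing query function
        put!(resp_chan, (res, scen_cubes))         # reply on this request's own channel
    end
end

# HTTP handler side (process 1): enqueue the request, then block until the response arrives.
function handle_qreq(qreq)
    resp_chan = RemoteChannel(() -> Channel{Any}(1))   # fresh response channel per request
    put!(main_chan, (qreq, resp_chan))
    return take!(resp_chan)                            # blocks until worker 2 has answered
end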

Does that kind of make sense? Or did I miss the point?

This should solve it. Or, a TCP/Unix socket can also be used seamlessly to read/write serialized data.
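For example, something in this direction (a rough sketch: the port, running both ends on the same machine, and reusing query_cubes are all assumptions; the listener handles one connection at a time):

using Sockets, Serialization

# Query worker side: accept one connection at a time, read a request, write the result back.
server = listen(9000)
@async while true
    sock = accept(server)
    qreq = deserialize(sock)
    serialize(sock, query_cubes(qreq))   # result goes straight back on the same socket
    close(sock)
end

# Process 1 side: connect, send the serialized request, block on the serialized response.
sock = connect(9000)                     # localhost; use connect(host, port) across machines
serialize(sock, qreq)
res, scen_cubes = deserialize(sock)
close(sock)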

Yes, a RemoteChannel for the request and then one for the response seems like a very good option. We are using a ReentrantLock on the main function of the worker to make it process one request at a time. So far it is working. Thanks for the reply!
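A minimal sketch of that lock-based approach (assuming query_cubes and QReq from the earlier code):

# On worker 2: a lock around the query so only one request runs at a time.
const query_lock = ReentrantLock()

function query_cubes_task(qreq::QReq)
    lock(query_lock) do
        query_cubes(qreq)    # concurrent remotecalls queue up on the lock
    end
end

# Process 1 is unchanged: remotecall_fetch(query_cubes_task, 2, qreq) still blocks for the result.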