What would be a nice way to call Julia from Python using a different process?

I have a small number of Julia functions that I want to call from Python. We tried PythonCall, which works for a toy example but not when integrated into the full Python program, possibly due to library incompatibilities or multi-threading.

Is there a simple way to call a Julia function using inter-process communication?

The two main Julia functions we want to call are:

function step(;v_ro = nothing, set_torque=nothing, v_wind_gnd=set.v_wind, wind_dir=0.0, depower=0.0, steering=0.0)
    # do some work
    nothing
end

and

function sys_state()
    SysState(kps4)
end

SysState is a large struct that in Python would best be represented as a dict with elements of different types.

Two questions:

  1. What would be the best low-level communication protocol? I thought of UDP or ZMQ (a rough sketch of the UDP variant is below).
  2. How can I serialize the struct in Julia and de-serialize it in Python?

A call overhead of 1 ms would be fine, but not much more.
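
For reference, the UDP variant I have in mind would be roughly this on the Julia side (just a sketch; the port, address, and echo payload are arbitrary placeholders):

using Sockets

# minimal UDP responder: one datagram in, one datagram out
sock = UDPSocket()
bind(sock, ip"127.0.0.1", 2001)

while true
    from, data = recvfrom(sock)              # blocks until a datagram arrives
    request = String(data)                   # payload as a string
    send(sock, from.host, from.port, "echo: " * request)
end

Serialization would still have to be layered on top of this, which is what question 2 is about.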

I haven’t tried it, but I found GitHub - joshbode/zmq-example: Example connecting Python to Julia and R using ØMQ, which seems to do exactly that. It’s very old, but the code looks basically runnable to me (and very simple).
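
The Julia side of such a ØMQ request/reply responder would look roughly like this (a from-scratch sketch using ZMQ.jl, not code taken from that repository; the port and the echo reply are arbitrary):

using ZMQ

# minimal REQ/REP responder; the Python side would use a pyzmq REQ socket
context = Context()
socket = Socket(context, REP)
bind(socket, "tcp://*:5555")

while true
    request = recv(socket, String)      # blocks until the client sends a request
    send(socket, "echo: " * request)    # dispatch on the request here instead of echoing
end

On the Python side, a pyzmq REQ socket connected to tcp://localhost:5555 completes the round trip.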

Thanks for the link, but it does not do any serialization/deserialization.

I thought about using GitHub - OxygenFramework/Oxygen.jl: 💨 A breath of fresh air for programming web apps in Julia, because it has nice serialization included, but I do not know how to use it from Python.

I would use Arrow for cross-language serialization, or define some manual transformations between your data structures and JSON and implement them on both sides (using JSON3/StructTypes on the Julia side).
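
For the JSON3/StructTypes route, the Julia side could look roughly like this (the struct is a small made-up stand-in; the real SysState has many more fields):

using JSON3, StructTypes

# stand-in struct for illustration only
struct DemoState
    time::Float64
    depower::Float64
    pos::Vector{Float64}
end

# tell JSON3 how to (de)serialize the struct automatically
StructTypes.StructType(::Type{DemoState}) = StructTypes.Struct()

state = DemoState(1.25, 0.3, [0.0, 0.0, 100.0])
str   = JSON3.write(state)            # e.g. {"time":1.25,"depower":0.3,"pos":[0.0,0.0,100.0]}
back  = JSON3.read(str, DemoState)    # round-trip back into the struct

On the Python side, json.loads() then gives a plain dict with the same keys.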


I am trying Oxygen.jl. It is not working:

using Oxygen, HTTP

function start_server()
    @get "/greet" function(req::HTTP.Request)
        return "hello world!"
    end

    # start the web server
    serve()
end

start_server()

Client:

In [1]: %run simulate_http.py
---------------------------------------------------------------------------
gaierror                                  Traceback (most recent call last)
~/repos/pykitesim/simulate_http.py in <module>
      4 connection = http.client.HTTPConnection('http://127.0.0.1:8080')
      5 
----> 6 connection.request('GET', '/greet')
      7 
      8 response = connection.getresponse()

~/repos/pykitesim/.pixi/envs/default/lib/python3.8/http/client.py in request(self, method, url, body, headers, encode_chunked)
   1254                 encode_chunked=False):
   1255         """Send a complete request to the server."""
-> 1256         self._send_request(method, url, body, headers, encode_chunked)
   1257 
   1258     def _send_request(self, method, url, body, headers, encode_chunked):

~/repos/pykitesim/.pixi/envs/default/lib/python3.8/http/client.py in _send_request(self, method, url, body, headers, encode_chunked)
   1300             # default charset of iso-8859-1.
   1301             body = _encode(body, 'body')
-> 1302         self.endheaders(body, encode_chunked=encode_chunked)
   1303 
   1304     def getresponse(self):

~/repos/pykitesim/.pixi/envs/default/lib/python3.8/http/client.py in endheaders(self, message_body, encode_chunked)
   1249         else:
   1250             raise CannotSendHeader()
-> 1251         self._send_output(message_body, encode_chunked=encode_chunked)
   1252 
   1253     def request(self, method, url, body=None, headers={}, *,

~/repos/pykitesim/.pixi/envs/default/lib/python3.8/http/client.py in _send_output(self, message_body, encode_chunked)
   1009         msg = b"\r\n".join(self._buffer)
   1010         del self._buffer[:]
-> 1011         self.send(msg)
   1012 
   1013         if message_body is not None:

~/repos/pykitesim/.pixi/envs/default/lib/python3.8/http/client.py in send(self, data)
    949         if self.sock is None:
    950             if self.auto_open:
--> 951                 self.connect()
    952             else:
    953                 raise NotConnected()

~/repos/pykitesim/.pixi/envs/default/lib/python3.8/http/client.py in connect(self)
    920     def connect(self):
    921         """Connect to the host and port specified in __init__."""
--> 922         self.sock = self._create_connection(
    923             (self.host,self.port), self.timeout, self.source_address)
    924         self.sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)

~/repos/pykitesim/.pixi/envs/default/lib/python3.8/socket.py in create_connection(address, timeout, source_address)
    785     host, port = address
    786     err = None
--> 787     for res in getaddrinfo(host, port, 0, SOCK_STREAM):
    788         af, socktype, proto, canonname, sa = res
    789         sock = None

~/repos/pykitesim/.pixi/envs/default/lib/python3.8/socket.py in getaddrinfo(host, port, family, type, proto, flags)
    916     # and socket type values to enum constants.
    917     addrlist = []
--> 918     for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
    919         af, socktype, proto, canonname, sa = res
    920         addrlist.append((_intenum_converter(af, AddressFamily),

gaierror: [Errno -2] Name or service not known


Any idea?

In the browser this URL works:
http://127.0.0.1:8080/greet

OK, this works (http.client.HTTPConnection expects just host:port, not a full URL with the http:// scheme):

import http.client

connection = http.client.HTTPConnection('127.0.0.1:8080')

def remote_call():
    connection.request('GET', '/greet')
    response = connection.getresponse()
    return response.read()

response = remote_call()
print(response.decode())

Calling remote_call() takes 800 µs to 900 µs on my laptop and 350 µs on my desktop.

Next step: use JSON.

With JSON it really works perfectly:

using KiteModels, KitePodModels, Oxygen, HTTP, StructTypes

# global model objects shared by the route handlers
set::Settings = deepcopy(se())
kcu::KCU = KCU(set)
kps4::KPS4 = KPS4(kcu)

# Add a supporting struct type definition so JSON3 can serialize & deserialize automatically
StructTypes.StructType(::Type{SysState}) = StructTypes.Struct()

function init()
    global integrator
    integrator = KiteModels.init_sim!(kps4, stiffness_factor=0.5, prn=true)
    nothing
end

function start_server(log=true)
    @get "/sys_state" function(req::HTTP.Request)
        SysState(kps4)
    end

    @get "/init" function(req::HTTP.Request)
        init()
    end

    # start the web server
    if log
        serve()
    else
        serve(access_log=nothing)
    end
end

The Python client:

import http.client
import json

connection = http.client.HTTPConnection('127.0.0.1:8080')

def init():
    connection.request('GET', '/init')
    response = connection.getresponse()
    obj = json.loads(response.read())
    return obj

def sys_state():
    connection.request('GET', '/sys_state')
    response = connection.getresponse()
    obj = json.loads(response.read())
    return obj

print(init())
state = sys_state()
print(state)

And the performance is much better than I said in my last post; I did not benchmark correctly before:

ipython
%timeit sys_state()
# result: 63.3 µs ± 3.12 µs per loop on Ryzen 7850X
#         99.9 µs ± 5.05 µs per loop on Laptop in performance mode
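
For completeness: the step() function from the original question could be exposed the same way, e.g. as a POST route whose JSON body carries the keyword arguments. This is only a sketch (not code from the repository linked below); the route name, the JSON3 body parsing, and the subset of keyword arguments are illustrative:

using JSON3   # in addition to the packages used above

# register this next to the existing @get routes, before serve() is called
@post "/step" function(req::HTTP.Request)
    # parse the JSON body into a JSON3.Object (behaves like a read-only Dict)
    params = JSON3.read(String(copy(req.body)))
    step(; v_ro     = get(params, :v_ro, nothing),
           depower  = get(params, :depower, 0.0),
           steering = get(params, :steering, 0.0))
    "ok"
end

The Python side would then POST a small JSON document such as {"depower": 0.2} to /step.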

UPDATE: The final code is available here: GitHub - ufechner7/pykitemodels: Kite power system models for Python
