Is there any ongoing effort to provide a Julia interface to the following?
Metal Performance Shaders (MPS) is a PyTorch framework backend for GPU training acceleration, providing scripts and capabilities to set up and run operations on Mac.
MLX is an array framework for machine learning research on Apple silicon, by Apple machine learning research. MLX provides FFTs and linear algebra operations (Cholesky, inv, qr, svd), and includes a list of examples.
MPS is not a PyTorch framework. MPS is a set of common operations implemented as Metal (GPU) kernels, one subset of which implements NN/CNN operations. In addition, PyTorch has an MPS backend which uses the MPS kernels to accelerate PyTorch operations.
That API seems like the way to go to wrap (with a JLL), if not already done; it should be simple, and you could also call it without a JLL.
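For illustration only, a minimal sketch of what calling such a C library directly from Julia (no JLL) could look like; the library name and the `mlx_*` symbols below are hypothetical placeholders, not the actual mlx-c interface, so check the real headers before copying this:

```julia
# Sketch of calling a C shared library directly from Julia via @ccall, no JLL.
# NOTE: the library name and the mlx_* symbols are hypothetical placeholders,
# not the real mlx-c API.
using Libdl

const libmlxc = "libmlxc"        # assumed shared-library name on the load path
Libdl.dlopen(libmlxc)            # fail early if the library can't be found

# Hypothetical C signature: int mlx_version_major(void)
major = @ccall libmlxc.mlx_version_major()::Cint
println("assumed MLX C library, major version: ", major)

# A JLL would instead ship the binary as an artifact and export the `libmlxc`
# constant for you, so the @ccall lines would look exactly the same.
```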
A high-level API on top of that would also help, e.g. for Whisper, and not just on Apple hardware, i.e. use that code there and other alternatives elsewhere.
For most people, calling Python with PythonCall.jl is good enough for stuff like that, though it might be better to call some (cross-platform) Rust wrapper or other, if it isn't done in Julia.
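As a rough sketch of that route, this is what driving PyTorch's MPS backend from Julia with PythonCall.jl could look like, assuming a Python environment with torch installed and an Apple-silicon Mac:

```julia
# Sketch: using PyTorch's MPS backend from Julia via PythonCall.jl.
# Assumes the Python environment seen by PythonCall has torch installed.
using PythonCall

torch = pyimport("torch")

# Fall back to the CPU when the MPS backend isn't available (non-Mac, old macOS).
dev = pyconvert(Bool, torch.backends.mps.is_available()) ? "mps" : "cpu"

x = torch.randn(256, 256; device=dev)
y = torch.randn(256, 256; device=dev)
z = torch.matmul(x, y)               # runs on the GPU when dev == "mps"

# Move the result back to the CPU and convert it into a Julia matrix.
z_jl = pyconvert(Matrix{Float32}, z.cpu().numpy())
println(size(z_jl))
```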
MLX has a Python API that closely follows NumPy. MLX also has fully featured C++, C, and Swift APIs, which closely mirror the Python API. MLX has higher-level packages like mlx.nn and mlx.optimizers with APIs that closely follow PyTorch to simplify building more complex models.
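Until a native Julia wrapper exists, that NumPy-like Python API is also reachable through PythonCall.jl. A small sketch, assuming `mlx` is installed in the Python environment PythonCall uses and that the `mlx.core` names below match the current release:

```julia
# Sketch: using MLX's NumPy-like Python API from Julia via PythonCall.jl.
# Assumes an Apple-silicon Mac with the `mlx` Python package installed.
using PythonCall

mx = pyimport("mlx.core")
np = pyimport("numpy")

a = mx.array(np.arange(8.0; dtype=np.float32))  # build an MLX array from NumPy
b = mx.fft.fft(a)            # FFTs, as listed above
c = mx.sum(a * a)            # NumPy-style elementwise ops and reductions
# (Cholesky, inv, qr, svd live under mx.linalg in the same style.)

mx.eval(b, c)                # MLX is lazy: force evaluation of the graph
println(pyconvert(Float32, c.item()))
```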