Multi-GPU inference in Flux.jl

Flux supports parallel *training* across multiple GPUs, as described in the docs: GPU Support · Flux

In my use case, I need to run *inference* with a single model on multiple GPUs. Is there a way to load-balance inference calls to the model so that all GPUs are kept maximally utilized?
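
For concreteness, here is a rough (untested) sketch of the kind of pattern I'm imagining: one replica of the model per GPU, with a simple worker pool handing each incoming batch to whichever device is free. `model` and `batches` are placeholders, and I'm assuming CUDA.jl's task-local `device!` semantics — please correct me if there's a more idiomatic approach:

```julia
using Flux, CUDA

# Sketch only: replicate the model once per GPU, then load-balance
# incoming batches over the replicas using a Channel as a device pool.
function replicated_inference(model, batches)
    ndev = length(CUDA.devices())

    # One replica per device, each with its weights resident on that GPU.
    replicas = map(0:ndev-1) do i
        CUDA.device!(i)
        Flux.gpu(deepcopy(model))
    end

    # The Channel acts as a pool of free device indices: a task takes one,
    # runs its batch there, and puts the index back when done.
    pool = Channel{Int}(ndev)
    foreach(i -> put!(pool, i), 1:ndev)

    results = Vector{Any}(undef, length(batches))
    @sync for (j, x) in collect(enumerate(batches))
        Threads.@spawn begin
            i = take!(pool)               # claim a free GPU
            CUDA.device!(i - 1)           # device! is task-local in CUDA.jl
            y = replicas[i](Flux.gpu(x))  # forward pass on that GPU
            results[j] = Flux.cpu(y)
            put!(pool, i)                 # release the GPU
        end
    end
    return results
end
```

This obviously pays the cost of holding `ndev` copies of the weights, which is fine for my model size — I'm mainly unsure whether the `Channel`-based hand-off is the right way to do the load balancing, or whether Flux/CUDA.jl already provide something for this.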