Slow Deployment of Flux Model

Hi. I have a trained Flux model which I wish to apply repeatedly using separate calls from an external program.

Since Julia 1.5 (and the corresponding Flux version), every time the function
is called, the line

using Flux

requires about 15 seconds to ‘initialize the CUDA driver’.

I do not even wish to use CUDA, and in any case this deployment only needs milliseconds of CPU compute.

Is there a way to import Flux without this (painfully slow) overhead each time I want to deploy the model?

Thanks,

DavidE
PS: The time taken for this is variable… sometimes it takes up to a minute just to get past ‘using Flux’.

If you want to deploy your model, I would start Julia once, load your model and packages once, and then make each external call against that already running Julia session. The easiest way to achieve this is to wrap your model in a REST API, along the lines of the sketch below. Not sure that’s what you wanted, but anyway, that’s my two cents.
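Roughly: a long-running Julia process loads Flux and the model once and then listens for requests, and each external call becomes an HTTP request to that process instead of a fresh Julia start. A minimal sketch, assuming HTTP.jl, JSON3.jl, and BSON.jl are installed and the model was saved to "model.bson" (the file name, port, and JSON input format are just placeholders for whatever your setup uses):

    using Flux, HTTP, JSON3, BSON

    # Pay the `using Flux` and model-loading cost once, at server startup.
    BSON.@load "model.bson" model

    function handler(req::HTTP.Request)
        x = Float32.(JSON3.read(req.body))         # expects a JSON array of numbers
        y = model(x)                               # CPU forward pass
        return HTTP.Response(200, JSON3.write(y))
    end

    # Blocks here and answers requests on localhost until you stop it.
    HTTP.serve(handler, "127.0.0.1", 8080)

From a Unix shell each deployment call is then just something like curl -d '[1.0, 2.0, 3.0]' http://127.0.0.1:8080, which returns without re-paying the package-loading cost.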


Thanks. That sounds helpful. How does a REST API work?
Does that mean pushing to an idle Julia session somehow?
[I see something about web APIs, but I am not talking about that; I just want to execute from a Unix shell.]

What about DaemonMode.jl?
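DaemonMode.jl keeps a single Julia session running as a daemon and evaluates the scripts you send it, so the ‘using Flux’ delay is only paid on the first call. A minimal sketch (the script name, model file, and input format are placeholders; adapt to however you saved your model):

    # predict.jl -- the script sent to the daemon on each call
    using Flux, BSON               # already loaded in the daemon after the first call

    BSON.@load "model.bson" model  # your trained model

    x = parse.(Float32, ARGS)      # read the input vector from the command-line arguments
    println(model(x))              # CPU forward pass; print the prediction

Start the daemon once with julia -e 'using DaemonMode; serve()', and then each external call from the shell becomes julia --startup-file=no -e 'using DaemonMode; runargs()' predict.jl 1.0 2.0 3.0, which skips the slow package loading on every call after the first.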


Just what I wanted!

Thanks!

Then you’re definitely better off with @Elrod’s suggestion. :+1: