Save Flux model to ONNX?

How to save a Flux model to ONNX? Is there currently a way to do it?


It seems to be under rework. I have never tried it, but you could probably grab the last working version and update once the rework lands.
https://github.com/FluxML/ONNX.jl

Did ONNX.jl actually support saving models? From looking at the repo it seems it supported only loading?

That might well be true; I didn’t really look into it :sweat:

We plan to support both - saving and loading. The exact timeframe however is unclear at the moment.


Thanks! That would be awesome. Meanwhile I would still be interested in a hacky solution that “works” right now.

Unfortunately there’s no quick and dirty solution for getting ONNX support off the ground, short of not using Julia ML libraries. It’s a massive API to support, so aside from instantiating your own protobufs with ONNX.jl and manually writing weight params into them, there’s not much else that can be done.

Is there any way to do it via PyCall? The model itself isn’t large.

Loading weights from an ONNX file is relatively easy, the hard part is to map the operations to Flux’s layers. For instance, the order of data dimensions in ONNX and Julia is reversed, indexing starts from different points, function signatures are just different sometimes, etc. PyCall can help with very few of these issues.
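To make the layout difference concrete, here is a minimal Base-Julia sketch (no packages needed, and the shapes are just illustrative) of converting between Flux’s WHCN layout and ONNX’s NCHW layout:

```julia
# Flux stores image batches as (width, height, channels, batch) — WHCN —
# while ONNX uses (batch, channels, height, width) — NCHW.
x_flux = rand(Float32, 28, 28, 3, 16)       # WHCN: 28×28 RGB images, batch of 16

# Reversing the dimension order gives the ONNX layout.
x_onnx = permutedims(x_flux, (4, 3, 2, 1))  # NCHW
@assert size(x_onnx) == (16, 3, 28, 28)

# And back again — the round trip is lossless.
x_back = permutedims(x_onnx, (4, 3, 2, 1))
@assert x_back == x_flux
```

Indexing is another of the mentioned mismatches: Julia counts dimensions from 1 while ONNX attributes count axes from 0, so an importer has to translate those too.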

Fortunately, we are moving forward with ONNX.jl - many core operators are already supported, and we now have a tutorial on adding new ones. I guess the simplest way is to just try loading your graph and then implement, or post feature requests for, the missing parts.

Maybe it would be nice to have a short tutorial / test for saving a Flux chain?

We don’t have integration with Flux yet, you’ll need to manually construct a Ghost.Tape with Base and NNlib primitives - see Ghost tutorial and the list of supported ops for details. In the future, it will be automated, but at the moment we mostly focus on adding new operators.

@dfdx : Doesn’t ONNX.jl support export of arbitrary functions (as long as they hit the primitives), including Flux layers?

Fwiw, I’m still maintaining ONNXNaiveNASflux.jl (https://github.com/DrChainsaw/ONNXNaiveNASflux.jl), which has primitives for most Flux layers, both export and import.

Importing to a Flux.Chain is however not supported, and I believe this is much the same difficulty that ONNX.jl will face with imports: Chain is just so different from the more generic graph used by ONNX, and I don’t think it can even represent all models that are representable in ONNX.

Exporting a Flux.Chain is of course no problem.
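A minimal sketch of what that export can look like, assuming ONNXNaiveNASflux’s exported `save` accepts a Flux model directly (check the package README for the exact signature in your version; the model and filename here are just examples):

```julia
# Hedged sketch: `save` from ONNXNaiveNASflux is assumed to take a
# filename and a Flux model built from supported layer primitives.
using Flux, ONNXNaiveNASflux

# A small example model using layers ONNXNaiveNASflux has primitives for.
model = Chain(Dense(10, 32, relu), Dense(32, 2))

# Write an ONNX file that Python / C++ ONNX runtimes can then load.
save("model.onnx", model)
```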


So I can export a Flux.Chain to be used by, say, Keras?

I’m using Julia, but I want the collaborators who use C++ / Python to be able to use our trained model. I can separately save the Flux model as BSON in addition to ONNX, so not being able to import to Flux.Chain is not a big problem for me.

Yes, as long as there is a way to import an ONNX model in Keras.

Hi @jling, did you manage to export a Flux model or even better Lux model to ONNX using ONNXNaiveNASflux.jl or any other packages?