Hi,
I’m trying to load a Faster R-CNN model that was trained in PyTorch and exported to ONNX. It was saved like this:
```python
torch.onnx.export(
    model,
    x[None, :, :, :],
    "detector.onnx",
    export_params=True,
    input_names=["input"],
    output_names=["output"],
    dynamic_axes={"input": {0: "batch_size"}, "output": {0: "batch_size"}},
)
```
Loading it like this gives an error:
```julia
julia> path = "detector.onnx"

julia> data = Float32.(randn((1, 3, 145, 501)))

julia> out = ONNX.load(path, data)
```
```text
┌ Error: Error while loading node ("/Split_output_0",) = Split("input", "/Constant_1_output_0")
│   axis => 0
└ @ ONNX ~/.julia/packages/ONNX/weSi3/src/load.jl:66
ERROR: UndefVarError: `args` not defined
Stacktrace:
  [1] load_node!(tape::Umlaut.Tape{ONNX.ONNXCtx}, #unused#::ONNX.OpConfig{:ONNX, :Split}, inputs::Vector{Umlaut.Variable}, attrs::Dict{Symbol, Any})
    @ ONNX ~/.julia/packages/ONNX/weSi3/src/load.jl:229
  [2] load_node!(tape::Umlaut.Tape{ONNX.ONNXCtx}, nd::NodeProto, backend::Symbol)
    @ ONNX ~/.julia/packages/ONNX/weSi3/src/load.jl:56
  [3] load(io::IOStream, args::Array{Float32, 4}; backends::Vector{Symbol}, exec::Bool)
    @ ONNX ~/.julia/packages/ONNX/weSi3/src/load.jl:310
  [4] load
    @ ~/.julia/packages/ONNX/weSi3/src/load.jl:267 [inlined]
  [5] #98
    @ ~/.julia/packages/ONNX/weSi3/src/load.jl:341 [inlined]
  [6] open(f::ONNX.var"#98#99"{Vector{Symbol}, Bool, Tuple{Array{Float32, 4}}}, args::String; kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
    @ Base ./io.jl:395
  [7] open
    @ ./io.jl:392 [inlined]
  [8] #load#97
    @ ~/.julia/packages/ONNX/weSi3/src/load.jl:340 [inlined]
  [9] load(filename::String, args::Array{Float32, 4})
    @ ONNX ~/.julia/packages/ONNX/weSi3/src/load.jl:339
 [10] top-level scope
```
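One thing I'm unsure about is the dimension order of `data`. Julia arrays are column-major, so my understanding (an assumption on my part, not something I've confirmed in the ONNX.jl docs) is that ONNX.jl expects the input dims reversed relative to PyTorch's NCHW layout. A small sketch of what I mean, using the shape from my example above:

```python
# Reversing a PyTorch-style NCHW shape into column-major WHCN order,
# which is what I *assume* ONNX.jl expects on the Julia side.
nchw = (1, 3, 145, 501)   # batch, channels, height, width
whcn = tuple(reversed(nchw))
print(whcn)  # (501, 145, 3, 1)
```

I'm not sure whether that is related to the `Split` error above, though.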
Does anyone know how to load this model correctly?