Import ONNX model

Hi.

I am having issues including ONNX models in my code.

    mnistnet = ONNX.load_model("/user/mp/home/models/vision/classification/mnist/model/mnist-1.onnx")
    weights = ONNX.load_weights("/user/mp/home/models/vision/classification/mnist/model/weights.bson")
    model = include("/user/mp/home/models/vision/classification/mnist/model/model.jl")
    ERROR: syntax: unexpected semicolon in tuple around /user/mp/home/models/vision/classification/mnist/model/model.jl:4
    Stacktrace:
     [1] top-level scope at /user/mp/home/models/vision/classification/mnist/model/model.jl:4

The model.jl file is the following.

    using Statistics 
    Mul(a,b,c) = b .* reshape(c, (1,1,size(c)[a],1)) 
    Add(axis, A ,B) = A .+ reshape(B, (1,1,size(B)[1],1)) 
    ((c_1 = reshape(var"weights[\"Constant312\"]", broadcast(Int64, Tuple([256, 10]))), c_2 = MaxPool((3, 3), var"pad=(0, 0, 0, 0)", var"stride=(3, 3)"), c_3 = CrossCor(var"weights[\"Constant340\"]", Float32[0.0], relu, var"stride=(1, 1)", var"pad=(2, 2, 2, 2)", var"dilation=(1, 1)"), c_4 = MaxPool((2, 2), var"pad=(0, 0, 0, 0)", var"stride=(2, 2)"), c_5 = CrossCor(var"weights[\"Constant321\"]", Float32[0.0], relu, var"stride=(1, 1)", var"pad=(2, 2, 2, 2)", var"dilation=(1, 1)"), c_6 = reshape(reshape(var"weights[\"Constant318\"]", broadcast(Int64, Tuple([8, 1, 1]))), broadcast(Int64, Tuple([8]))), c_7 = reshape(reshape(var"weights[\"Constant346\"]", broadcast(Int64, Tuple([16, 1, 1]))), broadcast(Int64, Tuple([16]))), c_8 = broadcast(Int64, Tuple([1, 256])), ((x_9,)->(c_1 * reshape(c_2(relu.(c_3(c_4(relu.(c_5(Div(0, x_9, var"weights[\"Constant377\"]")) .+ c_6))) .+ c_7)), c_8) .+ var"weights[\"Constant367\"]",;));),;)

The error seems unrelated to BSON.jl. It appears to be on the 4th line of your model.jl file. Rewriting that line for readability:

((c_1 = reshape(var"weights[\"Constant312\"]", broadcast(Int64, Tuple([256, 10]))),
  c_2 = MaxPool((3, 3), var"pad=(0, 0, 0, 0)", var"stride=(3, 3)"),
  c_3 = CrossCor(var"weights[\"Constant340\"]", Float32[0.0], relu, var"stride=(1, 1)", var"pad=(2, 2, 2, 2)", var"dilation=(1, 1)"),
  c_4 = MaxPool((2, 2), var"pad=(0, 0, 0, 0)", var"stride=(2, 2)"),
  c_5 = CrossCor(var"weights[\"Constant321\"]", Float32[0.0], relu, var"stride=(1, 1)", var"pad=(2, 2, 2, 2)", var"dilation=(1, 1)"),
  c_6 = reshape(reshape(var"weights[\"Constant318\"]", broadcast(Int64, Tuple([8, 1, 1]))), broadcast(Int64, Tuple([8]))),
  c_7 = reshape(reshape(var"weights[\"Constant346\"]", broadcast(Int64, Tuple([16, 1, 1]))), broadcast(Int64, Tuple([16]))),
  c_8 = broadcast(Int64, Tuple([1, 256])), 
  ((x_9,)->(c_1 * reshape(c_2(relu.(c_3(c_4(relu.(c_5(Div(0, x_9, var"weights[\"Constant377\"]")) .+ c_6))) .+ c_7)), c_8) .+ var"weights[\"Constant367\"]",;));),;)
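
The parse error itself comes from the trailing ",;" sequences: a comma followed by a semicolon inside tuple parentheses is not something the Julia parser accepts. A minimal reproduction (the input string here is just an illustrative fragment):

```julia
# Reproduce "unexpected semicolon in tuple" on a tiny input.
# raise = false makes Meta.parse return the error expression
# instead of throwing.
ex = Meta.parse("(x_9 -> x_9,;)"; raise = false)
ex.head  # :error -- the parser rejects the stray semicolon
```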

It looks like you are trying to load the weights into layers and construct an anonymous function that ties everything together. I am not sure why this is written as one big nested tuple. Here is what I think you are trying to do:

c_1 = reshape(var"weights[\"Constant312\"]", broadcast(Int64, Tuple([256, 10])))
c_2 = MaxPool((3, 3), var"pad=(0, 0, 0, 0)", var"stride=(3, 3)")
c_3 = CrossCor(var"weights[\"Constant340\"]", Float32[0.0], relu, var"stride=(1, 1)", var"pad=(2, 2, 2, 2)", var"dilation=(1, 1)")
c_4 = MaxPool((2, 2), var"pad=(0, 0, 0, 0)", var"stride=(2, 2)")
c_5 = CrossCor(var"weights[\"Constant321\"]", Float32[0.0], relu, var"stride=(1, 1)", var"pad=(2, 2, 2, 2)", var"dilation=(1, 1)")
c_6 = reshape(reshape(var"weights[\"Constant318\"]", broadcast(Int64, Tuple([8, 1, 1]))), broadcast(Int64, Tuple([8])))
c_7 = reshape(reshape(var"weights[\"Constant346\"]", broadcast(Int64, Tuple([16, 1, 1]))), broadcast(Int64, Tuple([16])))
c_8 = broadcast(Int64, Tuple([1, 256]))

f = x_9 -> c_1 * reshape(c_2(relu.(c_3(c_4(relu.(c_5(Div(0, x_9, var"weights[\"Constant377\"]")) .+ c_6))) .+ c_7)), c_8) .+ var"weights[\"Constant367\"]"

Of course, this just assigns the anonymous function to f. It would be better to refactor your model.jl file to contain a function that accepts the weights as an argument and returns a new model.
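
A minimal sketch of that refactor. The Flux layers (MaxPool, CrossCor, relu) are elided and the shapes are illustrative, not the real MNIST dimensions; only the pattern matters: build the layers once from the Dict, then return the forward pass as a closure.

```julia
# Hypothetical sketch: model.jl exposes a builder that takes the weights
# Dict explicitly instead of reading from a global.
function build_model(weights::AbstractDict)
    W = reshape(weights["Constant312"], 1, :)   # dense weight, reshaped once
    b = weights["Constant367"]                  # output bias
    # ... build c_2 through c_8 from the other "Constant*" entries here ...
    return x -> vec(W * reshape(x, :, 1)) .+ b  # forward pass as a closure
end

# Usage after loading:
#   weights = ONNX.load_weights(".../weights.bson")
#   model   = build_model(weights)
#   model(input)
```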

Got another error:

ERROR: UndefVarError: weights["Constant312"] not defined

Where my weights are:

julia> weights
Dict{String,Any} with 7 entries:
  "Constant340" => Float32[-0.0485564 0.0842415 … 0.0698724 0.00338067; -0.0910136 -0.168389 … 0.000768774 0.251301; … ; -0.128403 -0.0190255 … …
  "Constant321" => Float32[-0.00890567 -0.591976 … -0.159537 0.0384988; -0.236907 -0.475285 … 0.557541 0.226019; … ; -0.0645618 0.768216 … -0.29…
  "Constant312" => Float32[0.0916329 -0.0484465 -0.192239 0.018026; 0.121436 0.136893 0.0943201 0.0242815; … ; -0.147887 -0.167316 0.0317782 0.0…
  "Constant318" => Float32[-0.16154, -0.433836, 0.0916414, -0.0168522, -0.0650264, -0.131738, 0.0204176, -0.12111]
  "Constant367" => Float32[-0.044856; 0.00779166; … ; 0.0843221; -0.0545404]
  "Constant346" => Float32[-0.0822488, -0.108869, -0.14104, -0.204869, -0.179136, -0.215438, -0.133805, -0.195725, -0.268251, -0.258212, -0.0761…
  "Constant377" => fill(255.0)

Everywhere it says var"weights[...]" it should just be weights[...].
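
To illustrate the difference (with a dummy Dict standing in for the weights loaded via ONNX.load_weights): var"weights[\"Constant312\"]" is one literal variable name, so Julia looks for a variable that is literally called weights["Constant312"] and throws UndefVarError. Plain indexing into the Dict is what you want.

```julia
# Dummy stand-in for the loaded weights (2560 = 256 * 10 entries).
weights = Dict("Constant312" => zeros(Float32, 2560))

# Plain Dict indexing instead of the var"..." wrapper:
c_1 = reshape(weights["Constant312"], (256, 10))
size(c_1)  # (256, 10)
```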