NeuralODE does not accept multiple inputs even though the underlying neural network does

(This question is related to "How to use multiple inputs in `NeuralODE` object?", with a concrete example.)

Hi, I am trying to define a NeuralODE that takes two inputs (one is a Float array, the other an Int array). First I define a neural network to serve as the derivative:

using Flux
# The following struct takes two inputs and aggregates them with `op`
struct TwoInputsLayer
    layer1
    layer2
    op     # operation used to combine the two branches, e.g. +
end

# Call with a tuple: apply each branch to its input, then combine
(m::TwoInputsLayer)(x) = m.op(m.layer1(x[1]), m.layer2(x[2]))
Flux.@functor TwoInputsLayer

dudt = Chain(TwoInputsLayer(Dense(10, 16), Flux.Embedding(5, 16), +), Dense(16, 4));

This derivative network dudt works fine on its own: the following code

emb_input = rand(1:5, 100)     # Int indices for the Embedding branch
dense_input = rand(10, 100);   # Float64 features for the Dense branch
dudt((dense_input, emb_input))

produces the expected output:

4×100 Matrix{Float64}:
  2.06809   0.692935  -0.0242472  …  -0.492044   0.197574   1.71923
 -1.1337    1.1563     1.05582       -0.27764    0.596584  -1.46373
 -1.33768  -0.357405   0.0795294     -0.257927  -0.345279  -0.933275
 -2.12861  -0.844005   0.0125842      0.270249  -0.304456  -1.40022

However, when I pass dudt to construct a NeuralODE:

using DiffEqFlux, DifferentialEquations

tspan = (0.0, 1.0)
t = range(0.0, 1.0, length=10)
n_ode = NeuralODE(dudt, tspan, Tsit5(), saveat=t, reltol=1e-3, abstol=1e-5)

and pass the same inputs:

n_ode((dense_input, emb_input))

I get the following error:

StackOverflowError:

Stacktrace:
 [1] recursive_unitless_bottom_eltype(a::Type{Any}) (repeats 79984 times)
   @ RecursiveArrayTools ~/.julia/packages/RecursiveArrayTools/cbsoB/src/utils.jl:91

If I instead remove the parentheses and pass the two inputs as separate arguments:

n_ode(dense_input, emb_input)

I get a different error (presumably because emb_input is now interpreted as the parameter vector p):

BoundsError: attempt to access 100-element Vector{Int64} at index [1:160]
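
For comparison, a single-input NeuralODE should work when the network maps the state back to its own shape; here is a minimal sanity check (dudt_single is a hypothetical stand-in, not the two-input model above):

# Single-input derivative whose output shape (10×100) matches the state
dudt_single = Chain(Dense(10, 16), Dense(16, 10))
n_ode_single = NeuralODE(dudt_single, tspan, Tsit5(), saveat=t)
n_ode_single(dense_input)   # state is a plain Float64 matrix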

Does anyone know the correct way to pass two inputs to a NeuralODE? Here only the first input dense_input should be part of the ODE state; the second input emb_input is an Int array, so no derivative should be taken with respect to it.
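
One workaround I can think of is to keep only dense_input as the ODE state and close over the embedded emb_input as a constant. Below is a minimal sketch of that idea (names like dudt_closed are mine; note the last layer maps back to 10 rows so the derivative has the same shape as the state, and I am not sure the closed-over layers get picked up by NeuralODE's parameter vector, so this may only help for the forward pass):

emb    = Flux.Embedding(5, 16)
dense1 = Dense(10, 16)
dense2 = Dense(16, 10)        # output must match the state shape (10×100)

emb_const = emb(emb_input)    # 16×100 Float array, held fixed during the solve

# Derivative over the Float state only; emb_const is captured by the closure
dudt_closed = Chain(x -> dense1(x) .+ emb_const, dense2)

n_ode_closed = NeuralODE(dudt_closed, tspan, Tsit5(), saveat=t, reltol=1e-3, abstol=1e-5)
n_ode_closed(dense_input)     # state is only the Float array

Is something like this the intended pattern, or is there a built-in way?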

Any ideas or suggestions would be much appreciated, thanks!