Adding external parameters to neural network

Is there any way in Lux.jl or Flux.jl to create a function for a UDE and add some extra parameter, for instance the parameter `c` in the function `rhs!(...)` below, to a neural network, so that `c` is also estimated when the network is trained?

function rhs!(du, u, p, t)
    û = dudt2(u, p, st)[1]
    du[1] = û[1] + log(c / u[1])  # how can `c` be estimated as well?
    du[2] = û[2]
    du[3] = û[3]
end


Yes, you can. Set it up with `@compact`. Here is some pseudocode:

# `c` will have to be an array for it to be trainable
my_custom_model = @compact(; model, c = [2.0]) do x, ps
    # Note that state handling is automatic with `@compact`
    function rhs!(du, u, p, t)
        û = model(u, p.model)  # parameters are accessible via p.<fieldname>
        du[1] = û[1] + log(p.c[1] / u[1])  # p.c[1] since we set it up as an array
        du[2] = û[2]
        du[3] = û[3]
    end

    prob = ....
    return solve(prob, ....)
end

ps, st = Lux.setup(rng, my_custom_model)
# ps.c and ps.model are automatically populated

my_custom_model(x_input, ps, st)

There is also a new tutorial that is going to be merged this week, which demonstrates how to do UDEs with a clean syntax: Solving Optimal Control Problems with Symbolic Universal Differential Equations | Lux.jl Documentation


Thank you for the pseudocode.

I have some follow-up questions: I am getting an error when calling:

my_custom_model(x_input, p, st)

It states `type NullParameters has no field model`. Note that in my case:

 model =  Lux.Chain(Lux.Dense(3, 8, swish), Lux.Dense(8, 3))

I also have a function that updates the network parameter values as shown below:

function predict_node(θ)
    prob = ODEProblem{true}(my_custom_model, u0, tspan)
    _prob = remake(prob, p = θ)
    Array(solve(_prob, Vern7(), saveat = tsteps,
                sensealg = InterpolatingAdjoint(; autojacvec = ZygoteVJP())))
end

I am wondering whether it is possible to have `my_custom_model` return the extended model itself rather than the solution of the model?

The custom model code I shared already encapsulates the `ODEProblem`, so you don't need to define the predict function (you have effectively created a neural ODE inside a neural ODE, if I understand correctly).

The `NullParameters` error is because you did not pass the parameters `p` when constructing the `ODEProblem`.
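As a minimal sketch (assuming the `ps` returned by `Lux.setup` above), pass the parameters as the fourth argument so the problem does not fall back to `NullParameters`:

```julia
# Sketch: supply the parameters explicitly when constructing the problem,
# instead of relying on the default (NullParameters).
prob = ODEProblem{true}(my_custom_model, u0, tspan, ps)
```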

It seems that when I give structure to the RHS, the performance becomes significantly worse than with a pure neural ODE. It has a lot of difficulty estimating the parameter `c`, and the result depends heavily on the initialization. Is there a way to train the network and the estimate of the parameter `c` separately?

For this problem, you can make `c` a scalar, so that it doesn't get treated as a trainable parameter.
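A minimal sketch of that change (body elided, as in the pseudocode above; the exact way to reference `c` inside the closure may depend on your Lux version):

```julia
# Sketch: with `c` passed as a plain scalar rather than a one-element array,
# `@compact` does not register it as a trainable parameter, so the optimizer
# leaves it fixed during training.
my_custom_model = @compact(; model, c = 2.0) do x, ps
    # ... same body as before, referring to `c` directly instead of p.c[1]
end
```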

Unfortunately, there isn't a nicer way to mark it as "fixed". I created an issue on this (Allow "const" arrays as inputs to `@compact` · Issue #588 · LuxDL/Lux.jl · GitHub) but haven't gotten around to fixing it yet.