@register_symbolic with vector variables

It seems that @register_symbolic successfully stops tracing on scalar variables, but continues to trace with vector variables. Is there currently any way to get @register_symbolic to stop tracing on vector variables?

julia> f(x) = x.*x
f (generic function with 1 method)

julia> @register_symbolic f(x)

julia> @variables y(t) z(t)[1:3]
2-element Vector{Any}:

julia> f(y)

julia> f(z)
(broadcast(*, z(t), z(t)))[1:3]

Doh, answered my own question in the end.

julia> typeof(z)
Symbolics.Arr{Num, 1}

julia> @register_symbolic f(x::Symbolics.Arr{Num,1})

julia> f(z)

Follow-up question:

If I were to register some function and then set it equal to an MTK variable in DAE form, it might look something like:

@variables q(t)[1:3]
zeros(3) .~ q - f(z)

but that just results in

julia> q - f(z)
ERROR: MethodError: no method matching -(::Symbolics.Arr{Num, 1}, ::Num)
For element-wise subtraction, use broadcasting with dot syntax: array .- scalar

Defining the return type gets closer, but results in:

@register_symbolic f(x::Symbolics.Arr{Num,1})::Symbolics.Arr{Num,1}
julia> q - f(z)
ERROR: axes of (f(z(t))) not known
 [1] error(s::String)
   @ Base .\error.jl:35
 [2] axes(A::Symbolics.Arr{Num, 1})
   @ Symbolics C:\Users\jonda\.julia\packages\Symbolics\BQlmn\src\arrays.jl:528
 [3] promote_shape(a::Symbolics.Arr{Num, 1}, b::Symbolics.Arr{Num, 1})
   @ Base .\indices.jl:169
 [4] -(A::Symbolics.Arr{Num, 1}, B::Symbolics.Arr{Num, 1})
   @ Base .\arraymath.jl:7
 [5] top-level scope
   @ REPL[13]:1

Any way to define the function output axes when registering?

I’m not sure there is at this time, @jondavis847.

Yikes, okay thanks Chris. I went ahead and opened an issue with Symbolics.jl.

I was very excited to learn about and implement @register_symbolic. One of the main hurdles I’ve been battling with MTK is that the more complicated the functions and models get, the more difficult they become (for me) to implement. Mostly I’m struggling with complex conditionals that end up becoming massive IfElse.ifelse statements, vector and matrix operations with Symbolics, and at some point I was thinking I would need to implement more complex loops, like a Newton solver. Knowing that I can always fall back to @register_symbolic, no matter how complex the model or function gets, is a huge boon for MTK in my opinion! Also, being able to use external Julia packages, rather than having to build my own Symbolics-specific functions, is massive. Hopefully there’s a way for this to work with vectors and matrices!

We will have this soon… If you want to take a crack at it now, you can have a look at the @wrapped macro and how we implement some symbolic array-related functions: https://github.com/JuliaSymbolics/Symbolics.jl/blob/master/src/array-lib.jl#L237-L240

It’s not going to be very ergonomic to create, though; you’d have to use this constructor to set up the array term with the right size: https://github.com/JuliaSymbolics/Symbolics.jl/blob/master/src/arrays.jl#L365-L383


This is fantastic news @shashi! Thanks for the help and attention.

I was able to piece together a workaround MWE based on your recommendations, and I look forward to implementing more complicated functions.

using ModelingToolkit, DifferentialEquations

f(x) = x.*x
# must specify input and output type to register a vector-valued function
@register_symbolic f(x::Symbolics.Arr{Num,1})::Symbolics.Arr{Num,1}

function g(x)
    @syms i::Int
    @arrayop (i,) f(x[i]) term = f(x)
end

#testing necessary in-function and ODE workflow
function register_test()
    @variables t
    D = Differential(t)

    @variables y(t)[1:3] = zeros(3)
    @variables u(t)[1:3] = zeros(3)

    eqs = [
        D.(u) .~ ones(3)
        # appears we must convert from Symbolics.ArrayOp{Vector{Real}}
        # to Symbolics.Arr{Num,1} to do subtraction
        # probably do this in g(x), but wanted to look at the ArrayOp struct
        zeros(3) .~ y - Symbolics.Arr(g(u))
    ]
    structural_simplify(ODESystem(eqs, t, name=:sys))
end

sys = register_test()

prob = ODEProblem(sys, [], (0, 10))
sol = solve(prob, Rodas4(),adaptive=false, dt = 1)

retcode: Success
Interpolation: specialized 3rd order "free" stiffness-aware interpolation
t: 11-element Vector{Float64}:
u: 11-element Vector{Vector{Float64}}:
 [0.0, 0.0, 0.0]
 [0.9999999999999991, 0.9999999999999991, 0.9999999999999991]
 [1.9999999999999982, 1.9999999999999982, 1.9999999999999982]
 [2.9999999999999973, 2.9999999999999973, 2.9999999999999973]
 [3.9999999999999964, 3.9999999999999964, 3.9999999999999964]
 [6.999999999999991, 6.999999999999991, 6.999999999999991]
 [7.999999999999989, 7.999999999999989, 7.999999999999989]
 [8.99999999999999, 8.99999999999999, 8.99999999999999]
 [9.999999999999991, 9.999999999999991, 9.999999999999991]

julia> sol[sys.y]
11-element Vector{Vector{Float64}}:
 [0.0, 0.0, 0.0]
 [0.9999999999999982, 0.9999999999999982, 0.9999999999999982]
 [3.999999999999993, 3.999999999999993, 3.999999999999993]
 [8.999999999999984, 8.999999999999984, 8.999999999999984]
 [15.999999999999972, 15.999999999999972, 15.999999999999972]
 [48.99999999999988, 48.99999999999988, 48.99999999999988]
 [63.99999999999983, 63.99999999999983, 63.99999999999983]
 [80.99999999999982, 80.99999999999982, 80.99999999999982]
 [99.99999999999983, 99.99999999999983, 99.99999999999983]

Trying a few more complicated ideas reveals some nuance to this solution, as expected. I suspect it works great for pure array operations, but if, for example, I had a function that took a scalar input and returned a matrix output, I don’t see a way to express that with @arrayop.

In my opinion, the ideal solution would be to simply specify the output size during @register_symbolic, and then have Symbolics fully trust that function without any further tracing, with errors for incorrect inputs surfacing at runtime outside of Symbolics.

Thanks again for the help!

I’m going to add something like this:

@register_symbolic f(a::AbstractMatrix, c::AbstractVector)::AbstractArray{Real} begin
   size => ...
   container_type => ...
   eltype => ...
end
where the arguments will be in scope inside the begin block, to make it easy to compute the necessary metadata.
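For readers landing on this thread later: an interface along these lines did ship in Symbolics.jl as @register_array_symbolic. Below is a minimal sketch applying it to the f from this thread; the exact key names and behavior may differ between Symbolics versions, so treat the details as approximate rather than authoritative.

```julia
using Symbolics

f(x) = x .* x

# Register f as an opaque array-valued function. The begin block supplies
# the metadata (output size and eltype) up front, so Symbolics never has
# to trace into f. NOTE: key names follow the proposal above; check the
# Symbolics.jl docs for your installed version.
@register_array_symbolic f(x::AbstractVector) begin
    size = (size(x, 1),)   # output has the same length as the input
    eltype = eltype(x)
end

@variables t z(t)[1:3]
fz = f(z)   # remains a symbolic array term with known axes
```

With the size declared at registration time, expressions like `q - f(z)` no longer fail with "axes of (f(z(t))) not known", since the result's shape can be promoted against the other operand's.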


Excellent thanks @shashi !

Love this Shashi!