I’ve got a simple struct that looks something like this:
```julia
mutable struct variable{T,N} <: AbstractArray{T,N}
    values::Array{T,N}
    spaceIDs::Vector{Int}
end
```
I’ve defined `Base.:+`, as well as `Base.:*` and `Base.:/` (with a scalar), plus the basic `AbstractArray` interface functions.
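The interface definitions look roughly like this (a sketch, not my exact code):

```julia
# Basic AbstractArray interface, delegating to the wrapped Array:
Base.size(v::variable) = size(v.values)
Base.IndexStyle(::Type{<:variable}) = IndexLinear()
Base.getindex(v::variable, i::Int) = v.values[i]
Base.setindex!(v::variable, x, i::Int) = (v.values[i] = x)

# Arithmetic that preserves the wrapper type:
Base.:+(a::variable, b::variable) = variable(a.values + b.values, a.spaceIDs)
Base.:*(a::Number, b::variable) = variable(a * b.values, b.spaceIDs)
Base.:/(a::variable, b::Number) = variable(a.values / b, a.spaceIDs)
```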
I define an `ODEProblem` with an in-place function (that does nothing) and pass it to `solve`. It works great, even when the passed function actually does something to the variables, as long as I don't include the `saveat` option.
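To make this concrete, a minimal version of the setup looks roughly like this (the initial values, sizes, and `tspan` here are placeholders; the real right-hand side is the `test2D` function visible in the stacktrace below):

```julia
using OrdinaryDiffEq

# Placeholder initial condition for illustration; the real values come
# from my discretization.
u0 = variable(zeros(4, 4), [1, 2])

# In-place RHS that does nothing, just to isolate the saving step.
test2D(du, u, p, t) = (du.values .= 0.0; nothing)

prob = ODEProblem(test2D, u0, (0.0, 1.0))

soln = solve(prob, Tsit5())              # works fine
soln = solve(prob, Tsit5(), saveat=0.1)  # throws the conversion error below
```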
However, `soln = solve(prob, saveat=0.1)` throws a conversion error. Here is the start of the stacktrace:
```
Stacktrace:
[1] push!(a::Vector{AMPS.vars.variable{Float64, 2}}, item::Matrix{Float64})
@ Base ./array.jl:928
[2] copyat_or_push!(a::Vector{AMPS.vars.variable{Float64, 2}}, i::Int64, x::Matrix{Float64}, nc::Type{Val{false}})
@ RecursiveArrayTools ~/.julia/packages/RecursiveArrayTools/e9IdM/src/utils.jl:79
[3] _savevalues!(integrator::OrdinaryDiffEq.ODEIntegrator{OrdinaryDiffEq.Tsit5, true, AMPS.vars.variable{Float64, 2}, Nothing, Float64, SciMLBase.NullParameters, Float64, Float64, Float64, Float64, Vector{AMPS.vars.variable{Float64, 2}}, SciMLBase.ODESolution{Float64, 3, Vector{AMPS.vars.variable{Float64, 2}}, Nothing, Nothing, Vector{Float64}, Vector{Vector{AMPS.vars.variable{Float64, 2}}}, SciMLBase.ODEProblem{AMPS.vars.variable{Float64, 2}, Tuple{Float64, Float64}, true, SciMLBase.NullParameters, SciMLBase.ODEFunction{true, Main.finitediff.var"#test2D#2", LinearAlgebra.UniformScaling{Bool}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing}, Base.Iterators.Pairs{Union{}, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}, SciMLBase.StandardODEProblem}, OrdinaryDiffEq.Tsit5, OrdinaryDiffEq.InterpolationData{SciMLBase.ODEFunction{true, Main.finitediff.var"#test2D#2", LinearAlgebra.UniformScaling{Bool}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing}, Vector{AMPS.vars.variable{Float64, 2}}, Vector{Float64}, Vector{Vector{AMPS.vars.variable{Float64, 2}}}, OrdinaryDiffEq.Tsit5Cache{AMPS.vars.variable{Float64, 2}, AMPS.vars.variable{Float64, 2}, AMPS.vars.variable{Float64, 2}, OrdinaryDiffEq.Tsit5ConstantCache{Float64, Float64}}}, DiffEqBase.DEStats}, SciMLBase.ODEFunction{true, Main.finitediff.var"#test2D#2", LinearAlgebra.UniformScaling{Bool}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing}, OrdinaryDiffEq.Tsit5Cache{AMPS.vars.variable{Float64, 2}, AMPS.vars.variable{Float64, 2}, AMPS.vars.variable{Float64, 2}, OrdinaryDiffEq.Tsit5ConstantCache{Float64, Float64}}, OrdinaryDiffEq.DEOptions{Float64, Float64, Float64, Float64, OrdinaryDiffEq.PIController{Rational{Int64}}, typeof(DiffEqBase.ODE_DEFAULT_NORM), typeof(LinearAlgebra.opnorm), Nothing, DiffEqBase.CallbackSet{Tuple{}, Tuple{}}, typeof(DiffEqBase.ODE_DEFAULT_ISOUTOFDOMAIN), typeof(DiffEqBase.ODE_DEFAULT_PROG_MESSAGE), typeof(DiffEqBase.ODE_DEFAULT_UNSTABLE_CHECK), DataStructures.BinaryMinHeap{Float64}, DataStructures.BinaryMinHeap{Float64}, Nothing, Nothing, Int64, Tuple{}, Float64, Tuple{}}, AMPS.vars.variable{Float64, 2}, Float64, Nothing, OrdinaryDiffEq.DefaultInit}, force_save::Bool, reduce_size::Bool)
@ OrdinaryDiffEq ~/.julia/packages/OrdinaryDiffEq/AgIKX/src/integrators/integrator_utils.jl:64
```
It looks like it tries to `push!` a `Matrix` instead of my defined struct. Does anyone have an idea why this might be happening? My guess is that the solver at some point interprets my array type as a standard matrix, but why would that happen only when I specify `saveat`?
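For reference, here is a quick way to check that guess outside the solver. This sketch assumes no custom `Broadcast.BroadcastStyle` is defined for `variable` (only the basic interface above), in which case Julia's default broadcasting machinery allocates a plain `Array` for the result:

```julia
v = variable(zeros(4, 4), [1, 2])

# With no custom BroadcastStyle, broadcasting falls back to
# DefaultArrayStyle and allocates a plain Array, dropping the wrapper:
typeof(v .+ 0.1 .* v)   # Matrix{Float64}, not variable{Float64, 2}
```

That `Matrix{Float64}` is the same type that shows up in the `push!` call above.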