I’m interested in the bistable Schlögl model which, under reservoir assumptions on species A and B, can be represented in Catalyst as
```julia
schlogl = @reaction_network begin
    c1 * a / V, 2*S --> 3*S
    c2 / V^2, 3*S --> 2*S
    c3 * b * V, ∅ --> S
    c4, S --> ∅
end c1 c2 c3 c4 a b V
```
I am picking the constants such that this problem is relatively metastable, and I want to explore the equilibrium statistics of the problem, having run it until Tmax, a sufficiently large time. Thus, I am inclined to try the following code:
```julia
using Catalyst, JumpProcesses  # JumpProcesses was formerly DiffEqJump

V = 25
a = 1
b = 2
c1 = 3 * 2
c2 = 0.6 * 6
c3 = 0.25
c4 = 2.95
params = (c1, c2, c3, c4, a, b, V);
Tmax = 10^3;

schlogl = @reaction_network begin
    c1 * a / V, 2*S --> 3*S
    c2 / V^2, 3*S --> 2*S
    c3 * b * V, ∅ --> S
    c4, S --> ∅
end c1 c2 c3 c4 a b V

function run_schlogl(Sinit)
    dprob = DiscreteProblem(schlogl, Sinit, (0.0, Tmax), params)
    jprob = JumpProblem(dprob, Direct(), schlogl)
    jsol = solve(jprob, SSAStepper())
    return jsol.u[end]
end
```
and then call `run_schlogl` on a variety of initial conditions, a sufficient number of times to sample the distribution. Notice that I am only making use of the terminal value, in order to have independent samples.
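Concretely, the sampling loop I have in mind is something like the following sketch (the initial condition `[:S => 25]` and the sample count are arbitrary placeholders, not settled choices):

```julia
# Sketch of the sampling loop: collect one independent terminal state per run.
Nsamples = 10^4
samples = [run_schlogl([:S => 25]) for _ in 1:Nsamples]
```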
Now, the problem I am encountering is that even though I only need the state at Tmax, the solver is recording many intermediate values, and I see that the memory usage of the problem scales linearly with Tmax:

```julia
@btime run_schlogl()
  4.768 ms (34676 allocations: 4.23 MiB)
```
Presumably, I could improve performance if I avoided all these allocations, but I do not see a way to do this; `saveat` does not seem to work here.
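For reference, what I tried was passing `saveat` to the solve call, hoping to keep only the terminal time, but it did not reduce the memory usage for me:

```julia
# My attempt: request only the terminal time; allocations were unchanged.
jsol = solve(jprob, SSAStepper(); saveat = Tmax)
```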
I also notice there are a lot of type instabilities when I run `@code_warntype` on this. Is that simply an issue with the larger package, or are there ways to clean that up?