I am writing doctests for a package (Catalyst.jl) using Documenter. Many blocks return a large mess of output (not displayed here) that is stochastic or pseudo-stochastic. An example would be:
```julia
using Catalyst, DiffEqJump

rn = @reaction_network begin
    p, ∅ → X
    d, X → ∅
end p d

p = [1.0, 2.0]
u0 =        # (initial condition; value missing from the original post)
tspan = (0.0, 1.0)

discrete_prob = DiscreteProblem(rn, u0, tspan, p)
jump_prob = JumpProblem(rn, discrete_prob, Direct())
sol = solve(jump_prob, SSAStepper())
```
which returns a large stochastic solution object full of random values.
(I would wrap this in a `jldoctest` block.)
It would not be possible to test this by comparing against a fixed output. The alternative suggested in the Documenter docs is to use regular-expression filters. However, constructing these for all the various output blocks (and there will be many) would be a lot of work, especially considering that I have zero experience with regex. Basically, it is not a practical alternative.
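(For what it's worth, a single catch-all regex might sidestep writing a filter per block: as I understand the filter mechanism, Documenter removes whatever a filter matches from both the expected and the actual output before comparing them, so a regex that matches the whole output should make any result pass. A sketch, assuming Documenter's per-block `filter` syntax:

````
```jldoctest; filter = r"[\s\S]+"
julia> sol = solve(jump_prob, SSAStepper())
```
````

Here `[\s\S]+` is used instead of `.+` so the match spans newlines. But doing even this for every block still feels like a workaround.)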
Ideally, I would just want to test whether the code throws an error (in which case there is a problem); if it doesn't, it should be considered fine (and a regex would not likely improve on this binary check). Is there some simple way of accepting any output from a `jldoctest` block, as long as the code does not throw an error?
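(The closest thing I have found so far is applying one catch-all filter globally rather than per block: `makedocs` accepts a `doctestfilters` keyword whose regexes are applied to every doctest. A sketch of my build script under that assumption (the `sitename`/`modules` values are just placeholders):

```julia
using Documenter, Catalyst

makedocs(
    sitename = "Catalyst.jl",
    modules = [Catalyst],
    # Strip the entire expected and actual output of every doctest
    # before comparison, so any error-free block passes.
    doctestfilters = [r"[\s\S]+"],
)
```

I have not verified that this is the intended use of `doctestfilters`, so a cleaner, officially supported option would be very welcome.)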