Gillespie in parallel on the GPU with DiffEqGPU?

DifferentialEquations is getting support for integrating ensembles of ODEs over many parameter sets on the GPU, in parallel. Another example of an ‘embarrassingly parallel’ problem is stochastic simulation: many independent realizations, each using its own stream of random numbers, to be averaged over afterwards.
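For concreteness, here is what a single serial Gillespie (SSA) run looks like with the jump-process side of DifferentialEquations (DiffEqJump, now JumpProcesses); the birth-death model and rate constants below are made up purely for illustration:

```julia
using DifferentialEquations  # pulls in the jump-process (DiffEqJump/JumpProcesses) machinery

# Illustrative birth-death process: X -> X+1 at rate p[1]*X, X -> X-1 at rate p[2]*X
rate_birth(u, p, t) = p[1] * u[1]
affect_birth!(integrator) = (integrator.u[1] += 1)
rate_death(u, p, t) = p[2] * u[1]
affect_death!(integrator) = (integrator.u[1] -= 1)

birth = ConstantRateJump(rate_birth, affect_birth!)
death = ConstantRateJump(rate_death, affect_death!)

u0    = [10]          # initial copy number
p     = (0.9, 1.0)    # (birth rate, death rate), made-up values
tspan = (0.0, 10.0)

dprob = DiscreteProblem(u0, tspan, p)
jprob = JumpProblem(dprob, Direct(), birth, death)

sol = solve(jprob, SSAStepper())  # one stochastic trajectory
```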

Is it feasible to do a Gillespie simulation on the GPU in Julia? What has to be done to get independent parallel streams of random numbers on the GPU? Does/will DiffEqGPU support this?
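On the CPU, one way to get independent, reproducible streams is to hand every trajectory its own explicitly seeded RNG through `EnsembleProblem`'s `prob_func`. The sketch below does that for the same birth-death model (redefined so the snippet is self-contained); the `rng` keyword of `JumpProblem`, the seed offset, and the trajectory count are illustrative assumptions, and whether `EnsembleGPUArray()` from DiffEqGPU could stand in for `EnsembleThreads()` here is exactly the open question. On the GPU one would typically reach for a counter-based per-thread generator rather than per-trajectory Mersenne Twisters.

```julia
using DifferentialEquations, Random, Statistics

# Same illustrative birth-death model as above
rate_birth(u, p, t) = p[1] * u[1]
affect_birth!(integrator) = (integrator.u[1] += 1)
rate_death(u, p, t) = p[2] * u[1]
affect_death!(integrator) = (integrator.u[1] -= 1)
jumps = (ConstantRateJump(rate_birth, affect_birth!),
         ConstantRateJump(rate_death, affect_death!))

dprob = DiscreteProblem([10], (0.0, 10.0), (0.9, 1.0))

# Give trajectory i its own independent, reproducible RNG stream
# (assumes JumpProblem's `rng` keyword; seed values are arbitrary)
prob_func = (prob, i, repeat) ->
    JumpProblem(dprob, Direct(), jumps...; rng = MersenneTwister(1234 + i))

jprob0 = JumpProblem(dprob, Direct(), jumps...)
eprob  = EnsembleProblem(jprob0; prob_func = prob_func)

# Threaded CPU ensemble today; the question is whether EnsembleGPUArray()
# could take this role on the GPU.
sim = solve(eprob, SSAStepper(), EnsembleThreads(); trajectories = 1000)

# Average final copy number over the ensemble
mean_final = mean(s.u[end][1] for s in sim)
```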

I did just get callbacks working, so it might work on v0.2.0, which was just released today. Give it a try? I have no idea haha.

If it’s pure Gillespie, not Gillespie mixed with ODEs, then we might want to build an optimized method for that case. It might take some Cassette inlining magic.