Hello,
I am trying to perform Bayesian inference with a Turing model that has an ecological ODE system embedded in it; think of it as a more complex variant of a Lotka-Volterra model. The code currently runs without errors, but sampling takes multiple days: 7,000 draws for a model with 10 parameters takes close to 3 days. So I am planning to move the computation to GPUs.
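For context, here is a minimal sketch of the kind of ODE I am working with. The function name, parameter names, and numbers are simplified placeholders, not my actual 10-parameter system:

```julia
using DifferentialEquations

# Simplified 2-species stand-in for my actual system (the real model is a
# more complex Lotka-Volterra variant with 10 parameters).
function lotka_volterra!(du, u, p, t)
    α, β, γ, δ = p                        # placeholder parameter names
    du[1] = α * u[1] - β * u[1] * u[2]    # prey
    du[2] = δ * u[1] * u[2] - γ * u[2]    # predator
    return nothing
end

u0    = [1.0, 1.0]            # placeholder initial condition
tspan = (0.0, 10.0)           # placeholder time span
p     = [1.5, 1.0, 3.0, 1.0]  # placeholder parameter values
prob  = ODEProblem(lotka_volterra!, u0, tspan, p)
```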
Can someone help me figure out how to code this appropriately? I couldn't find any tutorials on performing inference with Turing on GPUs. Are there any available online? Here are some details:
- I use the default NUTS sampler and the AutoVern7(Rodas5()) ODE solver.
- None of the functions I use are custom-made; for instance, there are no custom prior distributions, no custom likelihood functions, etc.
- Following are the packages I need to use (a simplified sketch of how these pieces fit together is shown after this list):
```julia
using DifferentialEquations, Interpolations, XLSX, DataFrames
using Distributions, MCMCChains, Turing
using ForwardDiff, Preferences
```
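This is roughly how the model is wrapped in Turing and sampled on the CPU. It is a simplified sketch assuming the `prob` from the snippet above; the priors, the Gaussian observation model, and the `data` matrix are illustrative placeholders, not my actual ones:

```julia
using Turing, Distributions, DifferentialEquations, LinearAlgebra

@model function fit_model(data, prob)
    # Illustrative priors; my real model has 10 parameters with its own priors
    σ ~ InverseGamma(2, 3)
    α ~ truncated(Normal(1.5, 0.5), 0.5, 2.5)
    β ~ truncated(Normal(1.2, 0.5), 0.0, 2.0)
    γ ~ truncated(Normal(3.0, 0.5), 1.0, 4.0)
    δ ~ truncated(Normal(1.0, 0.5), 0.0, 2.0)

    # Solve the ODE with the sampled parameters, using the same solver as my real code
    predicted = solve(prob, AutoVern7(Rodas5()); p = [α, β, γ, δ], saveat = 0.1)

    # Placeholder Gaussian observation model; `data` is a 2×N matrix whose
    # columns line up with the saved time points
    for i in 1:length(predicted)
        data[:, i] ~ MvNormal(predicted[i], σ^2 * I)
    end
end

model = fit_model(data, prob)
chain = sample(model, NUTS(), 7_000)   # default NUTS, 7,000 draws, as in my run
```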
I couldn’t find any accessible tutorials or documentation on how to go forward with Turing on GPUs. Can someone please help me?
I can share more details of the code and the implementation process if needed. To reiterate, the current code runs fine on CPU cores; the only drawback is that execution takes a long time.
Thanks again for the help.