Dear Julia community,
I am new to Julia and am trying to implement a complex Bayesian inference procedure on an ODE system, using Turing.
I based my code on the great Turing tutorial: Bayesian Estimation of Differential Equations.
While the simple case works, I would like to extend this tutorial to two more realistic problems from my research.
- Let’s say I have a Lotka-Volterra model with variables X and Y. In my case, X and Y can differ by several orders of magnitude, so I want to log-transform both my data and the predicted ODE solution when evaluating the likelihood.
From the tutorial example, I tried the following code:
```julia
Turing.setadbackend(:forwarddiff)

@model function fitlv(data, prob1)
    σ ~ InverseGamma(2, 3)
    α ~ truncated(Normal(1.5, 0.5), 0.5, 2.5)
    β ~ truncated(Normal(1.2, 0.5), 0, 2)
    γ ~ truncated(Normal(3.0, 0.5), 1, 4)
    δ ~ truncated(Normal(1.0, 0.5), 0, 2)
    p = [α, β, γ, δ]
    prob = remake(prob1, p = p)
    predicted = solve(prob, Tsit5(), saveat = 0.1)

    # shift so everything is > 0 before taking the log
    temp = predicted .+ 1 .- minimum(predicted)
    for i = 1:length(predicted)
        log_predicted = log10.(temp[i])
        data[:, i] ~ MvNormal(log_predicted, σ)
    end
end

model = fitlv(log10.(odedata .+ 1 .- minimum(odedata)), prob1)
chain = mapreduce(c -> sample(model, NUTS(0.65), 1000), chainscat, 1:3)
```
But I ran into problems: for some parameter draws the ODE solver can fail or produce badly estimated values, so the shifted prediction can still be < 0, taking its log then fails (a complex/domain error) and the sampling stops. Do you have any insights on the best way to handle this?
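One option I have seen discussed is to explicitly reject such draws instead of letting the log error out. This is only a sketch of that idea (model/variable names like `fitlv_log` are mine, not from the tutorial): check the solver's return code and the positivity of the shifted solution, and assign `-Inf` log-probability via `Turing.@addlogprob!` so the sampler rejects the proposal. Note that on recent SciML versions the retcode check is spelled `SciMLBase.successful_retcode(predicted)` rather than comparing against the `:Success` symbol.

```julia
using Turing, DifferentialEquations

@model function fitlv_log(data, prob1)
    σ ~ InverseGamma(2, 3)
    α ~ truncated(Normal(1.5, 0.5), 0.5, 2.5)
    β ~ truncated(Normal(1.2, 0.5), 0, 2)
    γ ~ truncated(Normal(3.0, 0.5), 1, 4)
    δ ~ truncated(Normal(1.0, 0.5), 0, 2)
    prob = remake(prob1, p = [α, β, γ, δ])
    predicted = solve(prob, Tsit5(), saveat = 0.1)

    # Solver blew up (instability, dt below dtmin, ...): reject this draw
    # by giving it -Inf log-probability instead of erroring.
    if predicted.retcode != :Success
        Turing.@addlogprob! -Inf
        return
    end

    sol = Array(predicted)
    shifted = sol .+ 1 .- minimum(sol)
    # Guard against non-positive values before taking the log.
    if any(shifted .<= 0)
        Turing.@addlogprob! -Inf
        return
    end

    log_predicted = log10.(shifted)
    for i in 1:size(log_predicted, 2)
        data[:, i] ~ MvNormal(log_predicted[:, i], σ)
    end
end
```

One caveat with this transform: subtracting `minimum(sol)` makes the shift depend on the current parameter draw, so the data and the prediction are not transformed identically. A fixed shift constant chosen once from the data may be statistically cleaner, but I kept the structure of the tutorial-style code above.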
- Now let’s say that the X and Y data are not homogeneous and don’t share the same time vector or size. How do you implement this in Turing? Especially when estimating the likelihood, because if X has 10x more data points than Y, the fit will be weighted toward the X channel.
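For this case, one pattern I am considering (sketch only; `x_data`/`x_times`/`y_data`/`y_times` and the per-channel noise scales `σx`/`σy` are my own names) is to solve once with a dense output and then evaluate each channel's likelihood only at its own observation times, using the solution's interpolant `sol(t)`:

```julia
using Turing, DifferentialEquations

@model function fitlv_two_channels(x_data, x_times, y_data, y_times, prob1)
    σx ~ InverseGamma(2, 3)   # separate noise scale per channel
    σy ~ InverseGamma(2, 3)
    α ~ truncated(Normal(1.5, 0.5), 0.5, 2.5)
    β ~ truncated(Normal(1.2, 0.5), 0, 2)
    γ ~ truncated(Normal(3.0, 0.5), 1, 4)
    δ ~ truncated(Normal(1.0, 0.5), 0, 2)
    prob = remake(prob1, p = [α, β, γ, δ])
    sol = solve(prob, Tsit5())   # dense solution, no saveat needed

    # sol(t) interpolates the solution; state index 1 = X, index 2 = Y.
    for (j, t) in enumerate(x_times)
        x_data[j] ~ Normal(sol(t)[1], σx)
    end
    for (j, t) in enumerate(y_times)
        y_data[j] ~ Normal(sol(t)[2], σy)
    end
end
```

Each observation still contributes one likelihood term, so a channel with 10x more points does carry 10x more terms; giving each channel its own σ lets the sampler absorb some of that, and if you explicitly want to down-weight the denser channel you could scale its contribution with `Turing.@addlogprob!` instead of the `~` statements. I would be happy to hear whether this is a sensible approach.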
Many thanks for all your help!