Performing Bayesian estimation of ODE parameters when variables are not measured simultaneously

Greetings everyone!

Recently I have needed to use Bayesian methods to estimate the parameters of a chemical kinetic model that can be described by ordinary differential equations. I found that Turing.jl or DiffEqBayes.jl might be the right tools for this job.

The kinetic model has four variables; however, they are not experimentally measured at the same time points. For example, at time point ti, variables x1 and x3 have measured values but variables x2 and x4 are missing. In this case, how can I perform Bayesian estimation of the ODE parameters?

Just encode that in the loss function. Use saveat with the union of the measurement times and then split the output.
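For example, a minimal sketch of what that could look like, using a made-up 4-state ODE and made-up measurement times (none of this is the actual kinetic model):

    using DifferentialEquations

    # Hypothetical 4-state kinetic model (placeholder right-hand side)
    function rhs!(du, u, p, t)
        k1, k2 = p
        du[1] = -k1 * u[1]
        du[2] =  k1 * u[1] - k2 * u[2]
        du[3] =  k2 * u[2]
        du[4] = -k2 * u[4]
    end

    t_obs13 = [0.0, 0.5, 1.0, 1.5]   # times where x1 and x3 were measured
    t_obs24 = [0.25, 0.75, 1.25]     # times where x2 and x4 were measured
    t_all   = sort(union(t_obs13, t_obs24))

    prob = ODEProblem(rhs!, [1.0, 0.0, 0.0, 1.0], (0.0, 2.0), [0.7, 0.3])
    sol  = solve(prob, Tsit5(); saveat=t_all)

    # Split the solution back into the two measurement groups
    idx13 = findall(in(t_obs13), t_all)
    idx24 = findall(in(t_obs24), t_all)
    pred13 = Array(sol)[[1, 3], idx13]   # predicted x1, x3 at their own times
    pred24 = Array(sol)[[2, 4], idx24]   # predicted x2, x4 at their own times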

This should work. To fit the data, the loss function can be defined as a sum of weighted squared errors, where each weight is the reciprocal of the variance of the corresponding measurement.
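As a sketch of that weighting (with hypothetical data arrays data13/data24 and per-group measurement standard deviations σ13/σ24, continuing the names from the snippet above):

    # Sum of squared errors weighted by 1/variance of each measurement group
    loss = sum(abs2, (data13 .- pred13) ./ σ13) +
           sum(abs2, (data24 .- pred24) ./ σ24)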

But here is another question: I didn't find a loss function in this example from https://turing.ml/v0.22/tutorials/10-bayesian-differential-equations/. If I want to use a loss function, how should I do it in Turing.jl?

using Turing, DifferentialEquations, LinearAlgebra

@model function fitlv(data, prob)
    # Prior distributions.
    σ ~ InverseGamma(2, 3)
    α ~ truncated(Normal(1.5, 0.5); lower=0.5, upper=2.5)
    β ~ truncated(Normal(1.2, 0.5); lower=0, upper=2)
    γ ~ truncated(Normal(3.0, 0.5); lower=1, upper=4)
    δ ~ truncated(Normal(1.0, 0.5); lower=0, upper=2)

    # Simulate Lotka-Volterra model. 
    p = [α, β, γ, δ]
    predicted = solve(prob, Tsit5(); p=p, saveat=0.1)

    # Observations.
    for i in 1:length(predicted)
        data[:, i] ~ MvNormal(predicted[i], σ^2 * I)
    end

    return nothing
end

model = fitlv(odedata, prob)

# Sample 3 independent chains with forward-mode automatic differentiation (the default).
chain = sample(model, NUTS(0.65), MCMCSerial(), 1000, 3; progress=false)

A loss function and a maximum-likelihood observation model are effectively the same thing. Just do something like:

    for i in 1:length(predicted)
        if i in values_for_dataset1
            # variables 1 and 2 were measured at this time index
            data1[i] ~ MvNormal(predicted[1:2, i], σ^2 * I)
        else
            # variables 3 and 4 were measured at this time index
            data2[i] ~ MvNormal(predicted[3:4, i], σ^2 * I)
        end
    end

which would split a 4-observable system so that variables 1:2 go to data1 and variables 3:4 go to data2. Of course there are details to fill in there, but that should be a sufficient hint.
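To make the hint concrete, here is one way the details could be filled in, reusing the hypothetical 4-state ODE and measurement times from the earlier snippet; the model name, priors, and data layout are placeholders, not recommendations:

    using Turing, DifferentialEquations, LinearAlgebra

    @model function fit_split(data13, data24, prob, t_obs13, t_obs24)
        # Priors (placeholders)
        σ  ~ InverseGamma(2, 3)
        k1 ~ truncated(Normal(0.7, 0.3); lower=0)
        k2 ~ truncated(Normal(0.3, 0.2); lower=0)

        # Solve once, saving at the union of all measurement times
        t_all = sort(union(t_obs13, t_obs24))
        predicted = solve(prob, Tsit5(); p=[k1, k2], saveat=t_all)

        # Map each group's measurement times back to columns of the solution
        idx13 = findall(in(t_obs13), t_all)
        idx24 = findall(in(t_obs24), t_all)

        # x1 and x3 are observed at t_obs13, x2 and x4 at t_obs24
        for (k, i) in enumerate(idx13)
            data13[:, k] ~ MvNormal(predicted[i][[1, 3]], σ^2 * I)
        end
        for (k, i) in enumerate(idx24)
            data24[:, k] ~ MvNormal(predicted[i][[2, 4]], σ^2 * I)
        end

        return nothing
    end

    # data13 is 2 × length(t_obs13), data24 is 2 × length(t_obs24)
    model = fit_split(data13, data24, prob, t_obs13, t_obs24)
    chain = sample(model, NUTS(0.65), MCMCSerial(), 1000, 3; progress=false)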