Full code here
(GPTool) pkg> st
Project GPTool v0.1.0
Status `~/repos/gptool/Project.toml`
[99985d1d] AbstractGPs v0.2.6
[c7e460c6] ArgParse v1.1.0
[336ed68f] CSV v0.7.7
[324d7699] CategoricalArrays v0.8.1
[a93c6f00] DataFrames v0.21.7
[b4f34e82] Distances v0.9.0
[31c24e10] Distributions v0.23.10
[91a5bcdd] Plots v1.6.1
[fce5fe82] Turing v0.14.1
[37e2e46d] LinearAlgebra
[56ddb016] Logging
[de0858da] Printf
[10745b16] Statistics
using CSV
using DataFrames
using Turing
using Distributions
using Plots
using Distances
using LinearAlgebra
using AbstractGPs
df = DataFrame(
    subject = repeat(1:5, inner=3),
    obese = repeat(rand(Bool, 5), inner=3),
    timepoint = [1, 2, 3, 1, 3, 4, 1, 2, 5, 1, 4, 5, 1, 3, 5],
    bug = rand(Beta(0.9, 5), 15),
    nutrient = rand(Beta(0.9, 5), 15)
)
## Straight Turing
# Squared-exponential covariance function; note the elementwise square of the distance matrix.
sqexp_cov_fn(D, phi, eps=1e-3) = exp.(-D .^ 2 / phi) + LinearAlgebra.I * eps
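As a quick sanity check of the covariance function, a minimal sketch with made-up inputs (Xtest, Dtest, Ktest, and phi = 1.0 are hypothetical choices, not part of the model above):
# Hypothetical check: three 1-D points, their pairwise distances, and the resulting kernel matrix.
Xtest = reshape([1.0, 2.0, 4.0], 3, 1)              # 3x1 matrix, as the distance call expects
Dtest = pairwise(Distances.Euclidean(), Xtest, dims=1)
Ktest = sqexp_cov_fn(Dtest, 1.0)                    # phi = 1.0 chosen arbitrarily
isposdef(Symmetric(Ktest))                          # true; the eps jitter keeps it well-conditioned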
@model function myGP(y, Z, X, m=0, s=1, s_beta=3, cov_fn=sqexp_cov_fn)
    # Dimensions of GP predictors. Note that if X is a single column,
    # it needs to have dimension N x 1 (a matrix), not a vector of
    # length N, so that the distance function can process it properly
    # (a quick shape check follows the model definition).
    N, P = size(X)

    # Dimensions of linear model predictors (Z should be N x J).
    J = size(Z, 2)

    # Distance matrix.
    D = pairwise(Distances.Euclidean(), X, dims=1)

    # Priors.
    mu ~ Normal(m, s)
    sig2 ~ LogNormal(0, 1)
    phi ~ LogNormal(0, 1)

    # Realized covariance function.
    K = cov_fn(D, phi)

    # Prior for linear model coefficients.
    beta ~ filldist(Normal(0, s_beta), J)

    # Sampling distribution.
    y ~ MvNormal(mu * ones(N) + Z * beta, K + sig2 * LinearAlgebra.I(N))
end
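As the comment inside the model notes, the GP input X has to be an N x 1 matrix rather than a plain vector so that pairwise(..., dims=1) treats each row as an observation. A quick way to see the difference with the df above (just an illustration):
size(df.timepoint)                  # (15,)   -> a plain vector
size(Matrix(df[!, [:timepoint]]))   # (15, 1) -> the N x 1 matrix the model expects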
gp = myGP(df.bug, Matrix(df[!,[:subject, :obese, :nutrient]]), Matrix(df[!,[:timepoint]]))
chain = sample(gp, HMC(0.01, 100), 200)
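If sampling runs through, the chain can be inspected with the usual MCMCChains tooling (a sketch, assuming the sampler finishes without error; the plotting recipes are usually driven via StatsPlots, which isn't in the status list above):
describe(chain)   # posterior summaries and quantiles for mu, sig2, phi, and beta
plot(chain)       # trace/density plots; may need `using StatsPlots` on top of Plots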
I tried both, on their own and together, with the same result :-/
Separately, I’m trying to follow this example and am running into other issues. Is there a better place to raise those?