# Turing: "no method matching cholesky"

Good morning,

I am doing a Bayesian logistic regression with correlated regression coefficients. Here is the model I am working with:

```julia
@model BayesLogisticRegression(y, X) = begin
    N_Observations = size(X, 1)
    N_Features = size(X, 2)

    Sigma ~ InverseWishart(N_Features + 2, Matrix{Float32}(I, N_Features + 1, N_Features + 1))
    Mu ~ MvNormal(zeros(N_Features + 1), Matrix{Float32}(I, N_Features + 1, N_Features + 1))
    Coefficients ~ MvNormal(Mu, Sigma)

    # Slopes plus intercept (last coefficient)
    Logit = X * Coefficients[1:N_Features] .+ Coefficients[N_Features + 1]

    Probability = Sigmoid.(Logit)
    for i = 1:N_Observations
        y[i] ~ Bernoulli(Probability[i])
    end
end;
```

```julia
Model = BayesLogisticRegression(y, X)
Chain = sample(Model, NUTS(10, 100, 0.65))
```

I am currently getting the error: `MethodError: no method matching cholesky(::TrackedArray{…,Array{Float64,2}}; check = false)`.
It highlights `Sigma` in the error. I used https://discourse.julialang.org/t/hyper-priors-in-turing/20752 as a template, so I'm not sure why I'm getting this error. Thank you!

Hi,
The following minor modification of your code runs fine on the current #master branch of Turing using Julia 1.1.0.

```julia
using Turing, FillArrays, LinearAlgebra

# Numerically stable sigmoid function
sigmoid(x::T) where {T<:Real} = x >= zero(x) ? inv(exp(-x) + one(x)) : exp(x) / (exp(x) + one(x))

@model BayesLogisticRegression(y, X) = begin
    N, D = size(X)

    Sigma ~ InverseWishart(D + 2, Matrix{Float64}(I, D + 1, D + 1))
    Mu ~ MvNormal(Fill(0.0, D + 1), 1.0)
    Coefficients ~ MvNormal(Mu, Sigma)

    # Slopes plus intercept (last coefficient)
    Logit = X * Coefficients[1:D] .+ Coefficients[D + 1]

    Probability = sigmoid.(Logit)
    for i = 1:N
        y[i] ~ Bernoulli(Probability[i])
    end
end

# Initialize the model with some dummy data
Model = BayesLogisticRegression(rand(0:1, 100), randn(100, 10));

# NUTS sampling for 10 iterations
sample(Model, NUTS(0.65), 10);
```

However, if the initial value for `Sigma` is not a symmetric, positive-definite matrix, the sampling will break in the first iteration. That means you either have to be lucky or, better, set reasonable initial values for `Sigma`.
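For what it's worth, constructing a valid (symmetric, positive-definite) initial value for `Sigma` is easy with plain `LinearAlgebra`; the variable names below are just for illustration. The identity matrix always works, and for a random alternative, `A'A` is positive semi-definite for any square `A`, so adding a small multiple of `I` makes it strictly positive-definite:

```julia
using LinearAlgebra

D = 10  # number of features, matching the dummy data above

# Simplest valid choice: the identity matrix.
Sigma_init = Matrix{Float64}(I, D + 1, D + 1)

# A random but still valid alternative: A'A + eps*I.
A = randn(D + 1, D + 1)
Sigma_rand = Symmetric(A' * A + 1e-3I)
```

The `Symmetric` wrapper guards against tiny floating-point asymmetries in `A' * A`, so `isposdef` and `cholesky` are guaranteed to see an exactly symmetric matrix.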

Note: Unfortunately, we have to provide tailored gradients in Turing through our own DistributionsAD package for some distributions (including MvNormal), as not all distributions from the Distributions package support proper AD.

@cpfiffer is it possible to set initial values for `Sigma` using the new interface? If so, how would you do it?

– EDIT –
I seem to have been lucky and cannot reproduce it anymore. I think you will have to set appropriate initial values.

@Kai_Xu Do you think we might have to change the initialisation procedure for such cases?

```
Stacktrace:
 checkpositivedefinite(::Int64) at /Users/osx/buildbot/slave/package_osx64/build/usr/share/julia/stdlib/v1.1/LinearAlgebra/src/factorization.jl:11
 #cholesky!#97(::Bool, ::Function, ::Array{Float64,2}, ::Val{false}) at /Users/osx/buildbot/slave/package_osx64/build/usr/share/julia/stdlib/v1.1/LinearAlgebra/src/cholesky.jl:182
 #cholesky#101 at ./none:0 [inlined]
 cholesky at /Users/osx/buildbot/slave/package_osx64/build/usr/share/julia/stdlib/v1.1/LinearAlgebra/src/cholesky.jl:275 [inlined] (repeats 2 times)
 logpdf_with_trans(::InverseWishart{Float64,PDMats.PDMat{Float64,Array{Float64,2}}}, ::Array{Float64,2}, ::Bool) at /Users/martin/.julia/packages/Bijectors/hFDmJ/src/Bijectors.jl:365
 assume(::Turing.SampleFromUniform, ::InverseWishart{Float64,PDMats.PDMat{Float64,Array{Float64,2}}}, ::Turing.Core.RandomVariables.VarName{:Sigma}, ::Turing.Core.RandomVariables.VarInfo{Turing.Core.RandomVariables.Metadata{Dict{Turing.Core.RandomVariables.VarName,Int64},Array{Distribution,1},Array{Turing.Core.RandomVariables.VarName,1},Array{Real,1},Array{Set{Turing.Selector},1}},Real}) at /Users/martin/.julia/dev/Turing/src/inference/Inference.jl:519
```

It is possible, but it's super user-unfriendly. Basically you have to pass all parameters in a single vector, so this model would require you to unwrap all the elements of `Sigma`, `Mu`, and `Coefficients`. I'm hoping we can eventually overhaul this to allow passing in a `NamedTuple`, but right now it's really primitive.
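To make the "single vector" idea concrete, here is a rough sketch of what the unwrapping could look like for this model. All names (`Sigma0`, `Mu0`, `Coefficients0`, `init_theta`) are hypothetical, the concatenation order is an assumption (parameters in model-declaration order), and note that in practice Turing stores parameters in a transformed (unconstrained) space, so this raw concatenation is only a rough picture:

```julia
using LinearAlgebra

D = 10  # number of features, matching the dummy data above

# Hypothetical initial values (names chosen for illustration).
Sigma0        = Matrix{Float64}(I, D + 1, D + 1)  # symmetric positive-definite
Mu0           = zeros(D + 1)
Coefficients0 = zeros(D + 1)

# Unwrap every matrix/vector parameter and concatenate into one flat vector,
# assuming the order in which the model declares them.
init_theta = vcat(vec(Sigma0), Mu0, Coefficients0)

# (D+1)^2 + 2*(D+1) entries, i.e. 143 for D = 10.
length(init_theta)
```

This is exactly why the current interface is painful: any change to the model's parameter shapes silently changes the layout of the flat vector.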