I’m getting “ArgumentError: matrix is not symmetric/Hermitian.” on a call to MvNormal(x[r], Matrix(Hermitian((noiseMatrix(m)+fuzz)[r,r]))). How do I avoid this error if wrapping in Matrix(Hermitian(...)) doesn’t? Note: both noiseMatrix(m) and fuzz are created using quadratic forms, so the only non-Hermitian character they have should come from rounding error. I just want to ignore it and move on. How do I do this?
Note: this is on Julia 0.6. If updating to 1.x would help, let me know.
Thanks,
Jameson
This works on my machine:
julia> using Distributions
julia> MvNormal(rand(3), Matrix(Hermitian(rand(3,3) + I)))
FullNormal(
dim: 3
μ: [0.652153, 0.850454, 0.706293]
Σ: [1.25598 0.361987 0.768328; 0.361987 1.75802 0.954967; 0.768328 0.954967 1.52256]
)
You shouldn’t have to convert back to a Matrix: Hermitian is a special matrix type that tells linear-algebra functions to take advantage of the Hermitian property if they can.
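For example, something like the following sketch should work, assuming a Distributions version whose MvNormal constructor accepts any AbstractMatrix (the names A and S here are just for illustration):

using LinearAlgebra, Distributions

A = rand(3, 3)
S = Hermitian(A * A' + I)   # a quadratic form plus I, exactly Hermitian thanks to the wrapper
cholesky(S)                 # dispatches to the Hermitian-aware factorization
MvNormal(rand(3), S)        # no Matrix(...) round-trip needed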
It can sometimes backfire in terms of speed, though, at least for multiplication.
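If you want to check that on your own machine, a quick comparison could look like the sketch below (the matrix size is arbitrary, and it assumes BenchmarkTools is installed):

using LinearAlgebra, BenchmarkTools

A = rand(500, 500); A = A + A'   # an exactly symmetric dense matrix
B = rand(500, 500)

@btime $A * $B                   # plain dense multiply
@btime $(Hermitian(A)) * $B      # multiply through the Hermitian wrapper's specialized method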
Take a bigger matrix and the chances are higher that you get rounding errors (with a 6x6 matrix you already have something like a 50% chance):
julia> MvNormal(rand(6), Matrix(Hermitian(rand(6,6) + I)))
FullNormal(
dim: 6
μ: [0.209424, 0.0390436, 0.944169, 0.669357, 0.624151, 0.0911069]
Σ: [1.59011 0.289772 … 0.244625 0.620135; 0.289772 1.2934 … 0.219601 0.932076; … ; 0.244625 0.219601 … 1.73461 0.12922; 0.620135 0.932076 … 0.12922 1.24544]
)
julia> MvNormal(rand(6), Matrix(Hermitian(rand(6,6) + I)))
ERROR: PosDefException: matrix is not positive definite; Cholesky factorization failed.
Stacktrace:
[1] checkpositivedefinite at /Users/osx/buildbot/slave/package_osx64/build/usr/share/julia/stdlib/v1.0/LinearAlgebra/src/factorization.jl:11 [inlined]
[2] #cholesky!#90(::Bool, ::Function, ::Hermitian{Float64,Array{Float64,2}}, ::Val{false}) at /Users/osx/buildbot/slave/package_osx64/build/usr/share/julia/stdlib/v1.0/LinearAlgebra/src/cholesky.jl:153
[3] #cholesky! at ./none:0 [inlined]
[4] #cholesky!#91(::Bool, ::Function, ::Array{Float64,2}, ::Val{false}) at /Users/osx/buildbot/slave/package_osx64/build/usr/share/julia/stdlib/v1.0/LinearAlgebra/src/cholesky.jl:185
[5] #cholesky! at ./none:0 [inlined] (repeats 2 times)
[6] #cholesky#95 at /Users/osx/buildbot/slave/package_osx64/build/usr/share/julia/stdlib/v1.0/LinearAlgebra/src/cholesky.jl:275 [inlined]
[7] cholesky at /Users/osx/buildbot/slave/package_osx64/build/usr/share/julia/stdlib/v1.0/LinearAlgebra/src/cholesky.jl:275 [inlined] (repeats 2 times)
[8] Type at /Users/tamasgal/.julia/packages/PDMats/Kouno/src/pdmat.jl:19 [inlined]
[9] MvNormal(::Array{Float64,1}, ::Array{Float64,2}) at /Users/tamasgal/.julia/packages/Distributions/WHjOk/src/multivariate/mvnormal.jl:208
[10] top-level scope at none:0
In your example, I think it is a lack of positive definiteness, rather than rounding errors, that is causing the problems.
More generally, while working with the covariance matrix is traditional in analytical work, numerical calculations almost always use a factorization of it, commonly Cholesky.
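As a rough illustration of that point (plain LinearAlgebra only; μ, A, and Σ are made up here): once you have the Cholesky factor, sampling reduces to a matrix-vector product.

using LinearAlgebra

μ = rand(6)
A = rand(6, 6)
Σ = Hermitian(A * A' + I)   # positive-definite covariance built from a quadratic form
C = cholesky(Σ)             # the factorization MvNormal computes internally anyway
x = μ + C.L * randn(6)      # one draw from N(μ, Σ) using the lower-triangular factor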
I think that Distributions.MvNormal’s interface is less than ideal, so I am experimenting with a WIP alternative, which I use for Bayesian inference.
Nice tip: a symmetric, strictly diagonally dominant matrix with positive diagonal entries is positive definite:
julia> using LinearAlgebra
julia> isposdef(Matrix(Hermitian(rand(6,6) + 5I)))
true