I am trying to achieve better performance in matrix inversion by tagging the matrix as `Symmetric`.
The matrix is medium-sized (200×200) and complex valued (it is indeed symmetric, not Hermitian). The norms of the outputs in the tagged and untagged cases are wildly different; only the untagged one gives the correct result. Also, the matrix is regularized, i.e. the inverse exists.
I seem to get the same problem for real matrices as well. I’m on Julia 1.1.
What is the proper way to tell the `inv()` function that the input is symmetric?
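For reference, here is a minimal sketch of the comparison I am doing (the construction of the test matrix and the regularization shift `0.1*I` are illustrative assumptions, not my actual data):

```julia
using LinearAlgebra

n = 200
# B + transpose(B) is complex symmetric (not Hermitian), since
# transpose does not conjugate the entries.
B = randn(ComplexF64, n, n)
A = B + transpose(B) + 0.1 * I   # small shift standing in for the regularization

@show issymmetric(A)   # true
@show ishermitian(A)   # false in general

# Compare the plain inverse against the tagged one.
Ainv  = inv(A)
Asinv = inv(Symmetric(A))
@show norm(Ainv - Asinv)
```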
Is it possible you meant `A = randn(n,n) + 1im * randn(n,n)`? Otherwise `randn(n)` creates a length-n vector, and then you’re taking its outer product, creating a singular matrix.
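To illustrate: the outer product of a vector with itself is always rank one, so the resulting matrix has no inverse (the size `n = 5` is just for the sketch):

```julia
using LinearAlgebra

n = 5
v = randn(n)
M = v * transpose(v)   # outer product: an n×n rank-1 matrix

@show rank(M)          # 1
@show abs(det(M))      # numerically zero, so M is singular
```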
Sorry, you’re right of course; the question remains, though. The example was not applicable to my problem, so I took it out.
I increased the regularization and now `norm(inv(A) - inv(Symmetric(A)))` goes to zero (for higher regularization). Clearly, this is not desired, as higher regularization means less precise results. Is there a way to speed up matrix inversions of close-to-singular, symmetric matrices?
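One way to see the interaction is to track the condition number as the regularization strength grows. A sketch, assuming a positive semidefinite base matrix and a Tikhonov-style shift `λ*I` (both assumptions, chosen so the effect is predictable):

```julia
using LinearAlgebra

n = 200
C  = randn(n, n)
A0 = C' * C                     # real symmetric positive semidefinite

for λ in (1e-8, 1e-4, 1e-1)
    A = Symmetric(A0 + λ * I)   # assumed regularization form
    # cond(A) = (σ_max + λ) / (σ_min + λ): a larger λ improves the
    # conditioning (so inv is computed more accurately) but biases
    # the result away from the inverse of the unregularized matrix.
    println("λ = $λ:  cond(A) = ", cond(A))
end
```

This matches the observation: with stronger regularization both code paths operate on a well-conditioned matrix and agree, while near-singularity amplifies their differing rounding behavior.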
This looks like a problem with conditioning. Are `inv(A)` and `inv(Symmetric(A))` using the same algorithm to compute the inverse? If not, poor conditioning could lead to different results. This would explain your observations as the regularization increases.
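A quick way to probe this: to my understanding, `inv` on a plain dense matrix goes through an LU factorization, while the `Symmetric` wrapper dispatches to a Bunch-Kaufman factorization — worth double-checking with `@which inv(Symmetric(A))` on your Julia version. A sketch comparing the two factorizations on a small assumed test matrix:

```julia
using LinearAlgebra

n = 4
B = randn(ComplexF64, n, n)
A = B + transpose(B)               # complex symmetric test matrix

F1 = lu(A)                         # the path a generic dense inv takes
F2 = bunchkaufman(Symmetric(A))    # the path the Symmetric wrapper takes
@show typeof(F1)
@show typeof(F2)

# Both factorizations solve linear systems; on a well-conditioned
# matrix their answers agree, but their rounding behavior differs.
b = randn(ComplexF64, n)
@show norm((F1 \ b) - (F2 \ b))
```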