Cholesky Factorization transpose



I have a Matlab code like this:

    T = 0.5;
    L = chol([DeltaT,rho*DeltaT;rho*DeltaT,DeltaT])';

I tried to translate it into Julia code:

    T = 0.5
    L = cholfact([DeltaT rho*DeltaT;rho*DeltaT DeltaT])'

but got an error message:

    ctranspose not implemented for Base.LinAlg.Cholesky{Float64,Array{Float64,2}}



Should I use one of these instead?

    L = chol([DeltaT rho*DeltaT;rho*DeltaT DeltaT])'
    L = cholfact([DeltaT rho*DeltaT;rho*DeltaT DeltaT], :L)


Thanks so much!


The latter returns a `Cholesky` factorization object (which is not a subtype of `AbstractArray`), rather than a lower triangular matrix. You can extract the factor with

    L = cholfact([DeltaT rho*DeltaT;rho*DeltaT DeltaT], :L)
    Lmatrix = L[:L]

Alternatively, these work:

    L = cholfact([DeltaT rho*DeltaT;rho*DeltaT DeltaT])[:L]
    L = cholfact([DeltaT rho*DeltaT;rho*DeltaT DeltaT], :L)[:L]

If performance matters, go with the second of these two: the first computes the upper triangular factor and then transposes it, while the second computes the lower triangular factor directly.
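A minimal sketch checking that both forms agree, using the 0.6-era API from this thread (the values for `DeltaT` and `rho` are illustrative, not from the original question):

```julia
# Julia 0.6-era API, matching the code in this thread.
DeltaT = 0.5
rho = 0.3
A = [DeltaT rho*DeltaT; rho*DeltaT DeltaT]

L1 = cholfact(A)[:L]        # computes U, then transposes to get L
L2 = cholfact(A, :L)[:L]    # computes L directly, no transpose

@assert L1 ≈ L2             # same factor either way
@assert L1 * L1' ≈ A        # the factor reconstructs the matrix
```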

For a 4000x4000 matrix, this makes the difference between 262/315 ms and 185/187 ms minimum/median on my computer.
The extra allocations of the non-lazy transpose on 0.6.2 are why the minimum/median gap is so large when computing the upper triangle and transposing.

On 0.7 the timings were 270/271 ms and 203/204 ms, respectively.
The syntax was updated to `cholfact(Hermitian(X, :L)).L`.
Memory requirements did not go down, because `.L` still transposes the data when using `cholfact(Hermitian(X)).L`.
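For current Julia (1.x), where `cholfact` was renamed to `cholesky` in the `LinearAlgebra` stdlib, a hedged sketch of the equivalent (matrix values are illustrative):

```julia
using LinearAlgebra

DeltaT = 0.5
rho = 0.3
A = [DeltaT rho*DeltaT; rho*DeltaT DeltaT]

# Hermitian(A, :L) tells cholesky to read only the lower triangle,
# so F.L is the lower triangular factor without an extra transpose.
F = cholesky(Hermitian(A, :L))
L = F.L

@assert L isa LowerTriangular
@assert L * L' ≈ A
```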