So, I just found that my Julia code for embedding a reduced density matrix back into the full Hilbert space has some issues. I have a Mathematica function that does what I want correctly, and since I want to work in Julia I "translated" it more or less line by line. I suspect the problem is related to Julia being column-major while Mathematica is row-major. This is the function I have at the moment, but it is not even returning a Hermitian matrix:
using TensorCore, Yao, QuantumInformation, LinearAlgebra
function eye(n::Int)
    # n×n identity matrix (same as Matrix{Float64}(I, n, n))
    m = zeros(Float64, n, n)
    for i in 1:n
        m[i, i] = 1.0
    end
    return m
end
function embed_rdmat(rho_init, N::Int, rA::Vector{Int})
    sizeA = length(rA)
    rB = setdiff(collect(1:N), rA)
    sizeB = length(rB)
    # view ρ_A and the identity on B as tensors with one size-2 index per ket/bra leg
    rho_tensor_A = reshape(rho_init, Tuple(map(x -> 2, collect(1:2*sizeA))))
    id_tensor_B = reshape(eye(2^sizeB), Tuple(map(x -> 2, collect(1:2*sizeB))))
    # tensor product ρ_A ⊗ 1_B, viewed as a rank-2N tensor
    rho = reshape(tensor(rho_tensor_A, id_tensor_B), Tuple(map(x -> 2, 1:2*N)))
    # permutation that is supposed to put the A indices on the sites in rA and the
    # B indices on the sites in rB (this is the part I suspect I translated incorrectly)
    indices_to_permute = zeros(Int, 2, N)
    indices_to_permute[1, :] = union(rA, rB)
    indices_to_permute[2, :] = union(rA, rB) .+ N
    rho = reshape(permutedims(rho, reduce(hcat, indices_to_permute)[1, :]), (2^N, 2^N))
    return rho
end
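There are two things I am not sure I translated correctly. First, Julia's reshape is column-major (the first index varies fastest), while Mathematica flattens and reshapes row by row, so the same reshape call pairs the qubits with the tensor indices in the opposite order. Second, if I read the documentation correctly, Mathematica's Transpose[T, perm] sends level k of T to level perm[[k]] of the result, while Julia's permutedims(T, perm) makes dimension i of the result equal to dimension perm[i] of the input, i.e. the two conventions are inverse to each other. A toy example of the second point (unrelated to the density matrix itself):

A = reshape(collect(1:24), 2, 3, 4)
# Mathematica: Transpose[A, {2, 3, 1}] sends level 1 -> 2, 2 -> 3, 3 -> 1,
# so the result has dimensions (4, 2, 3).
# In Julia I apparently need the inverse permutation to get the same layout:
B = permutedims(A, invperm([2, 3, 1]))
size(B)    # (4, 2, 3)

And this is how I am testing the function: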
N = 6;
ψ = rand_state(N);    # random 6-qubit state from Yao
rA = [1, 3]
rB = [2, 4, 5, 6]
ψvec = ψ |> state;
# reduced density matrix on the qubits in rA (trace out rB)
rhoA = QuantumInformation.ptrace(ψvec*ψvec', map(x->2, collect(1:N)), rB);
rhoA_fullH = embed_rdmat(rhoA, N, rA)
ishermitian(rhoA_fullH)    # returns false, which is the problem
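For reference, this is the kind of construction I think I need, written directly against Julia's column-major conventions: kron puts its second argument on the low-order bits, and the row indices 1:N and column indices N+1:2N are permuted in lockstep, so hermiticity cannot be lost in the permutation step. It is only a sketch (embed_rdmat_v2 is just a name I made up), and it assumes that the qubit ordering inside the rhoA returned by ptrace matches the ascending order of rA in the same bit convention as the full register, which I have not verified against Yao's little-endian layout, so A may still end up on the wrong sites:

# Alternative sketch: build ρ_A ⊗ 1_B with kron, then permute tensor indices,
# keeping row and column indices in lockstep.
function embed_rdmat_v2(rhoA::AbstractMatrix, N::Int, rA::Vector{Int})
    rB = setdiff(1:N, rA)
    nB = length(rB)
    # kron puts A on the high-order bits and B on the low-order bits
    full = kron(rhoA, Matrix{eltype(rhoA)}(I, 2^nB, 2^nB))
    # column-major reshape: axis k is the k-th row bit, axis N + k the k-th column bit
    T = reshape(full, ntuple(_ -> 2, 2 * N))
    # send B's axes (1:nB) to the sites in rB and A's axes (nB+1:N) to the sites in rA,
    # with the same shift by N for the column axes
    perm = invperm(vcat(rB, rA, rB .+ N, rA .+ N))
    return reshape(permutedims(T, perm), 2^N, 2^N)
end

rhoA_full2 = embed_rdmat_v2(rhoA, N, rA)
ishermitian(rhoA_full2)               # Hermitian whenever rhoA is
tr(rhoA_full2) ≈ 2^(N - length(rA))   # tr(ρ_A ⊗ 1_B) = tr(ρ_A) · 2^|B|, with tr(ρ_A) ≈ 1

This at least gives a Hermitian matrix with the right trace, but I am not confident the permutation really reproduces what the Mathematica code does.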
Any suggestion on how to tackle this issue is more than welcome!