Automatic differentiation with a density matrix in Yao

Hey there,
I am currently implementing a quantum variational algorithm whose goal is to minimize a cost function using a gradient-based optimizer. Since I have no analytically closed form for the gradient, I am currently stuck.

I know that for a pure state one can use automatic differentiation simply by calling expect'(op::AbstractBlock, reg).
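For instance, something along these lines works for me in the pure-state case (a toy two-qubit example with arbitrarily chosen parameters; if I remember correctly, the `reg => circuit` pair form also returns the parameter gradients):

```julia
using Yao

# toy variational circuit and observable
circuit = chain(2, put(1 => Rx(0.4)), put(2 => Ry(0.7)), control(1, 2 => X))
op      = put(2, 1 => Z)
reg     = zero_state(2)

# gradient of <op> with respect to the input state and the circuit parameters
grad_state, grad_params = expect'(op, reg => circuit)
```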

Now my question: is there a similar function for mixed states, where I have a density matrix, so that I could simply call expect'(op::AbstractBlock, density_matrix)?

If that’s not the case: do you have any idea how I could get the gradient of an expectation value in a simple fashion when my system is described by a density matrix in Yao?

The expectation value of an operator O is Tr[U† O U ρ], where † denotes the Hermitian conjugate and ρ is the density matrix. The unitary time-evolution operator U depends on a set of variational parameters, and the gradient is taken with respect to those parameters.
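For concreteness, this is the quantity I want to differentiate, written out brute-force with plain matrices (just a sketch; the circuit and the diagonal ρ below are made-up placeholders):

```julia
using Yao, LinearAlgebra

circuit = chain(2, put(1 => Rx(0.4)), put(2 => Ry(0.7)), control(1, 2 => X))
op      = put(2, 1 => Z)

U = Matrix(mat(circuit))                     # parameterized unitary
O = Matrix(mat(op))                          # observable

ρ = Matrix(Diagonal([0.4, 0.3, 0.2, 0.1]))   # some fixed mixed state

expectation = real(tr(U' * O * U * ρ))       # Tr[U† O U ρ]
```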

I think that’s a question for the developers of Yao.jl, ping @Roger-luo and @1115

Thanks a lot for your reply. I’m not entirely sure what you mean by pinging them.

I just did, by tagging them in this post :wink: If they don’t answer here, perhaps opening a GitHub issue would be more effective.

Oh :joy: :sweat_smile:, appreciate your help!

I don’t think this feature is available in Yao yet. Yao has a naive AD engine that only supports differentiating reversible gates; density-matrix simulation, however, is usually not reversible and requires checkpointing.
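If your system is small, one possible stop-gap (generic Julia AD rather than anything built into Yao) is to build the dense matrices by hand and differentiate the trace formula with Zygote. A rough sketch with hand-written single-qubit gates and a made-up ρ:

```julia
using Zygote, LinearAlgebra

const I2 = [1.0 0.0; 0.0 1.0]
const X  = [0.0 1.0; 1.0 0.0]
const Z  = [1.0 0.0; 0.0 -1.0]

# single-qubit rotations written without mutation, so Zygote can track them
rx(θ) = cos(θ / 2) * I2 - im * sin(θ / 2) * X
rz(θ) = cos(θ / 2) * I2 - im * sin(θ / 2) * Z

U(θ) = rz(θ[2]) * rx(θ[1])                   # toy one-qubit ansatz

O = Z                                        # observable
ρ = [0.7 0.0; 0.0 0.3]                       # some fixed mixed state

cost(θ) = real(tr(U(θ)' * O * U(θ) * ρ))     # Tr[U† O U ρ]

# gradient with respect to the variational parameters
# (it may come back with a negligible imaginary part; take real.() if needed)
grad = Zygote.gradient(cost, [0.3, 0.5])[1]
```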

Currently, I am focusing more on simulating quantum circuits by contracting tensor networks. The contraction is backed by OMEinsum, which has good support for automatic differentiation.
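To illustrate what I mean, a contraction written with OMEinsum can be pushed through Zygote directly (a trivial example, nothing to do with your specific circuit):

```julia
using OMEinsum, Zygote

A, B = randn(2, 3), randn(3, 2)

# sum of a matrix product, written as an einsum contraction
f(A) = sum(ein"ij,jk->ik"(A, B))

Zygote.gradient(f, A)[1]   # gradient of f with respect to A
```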

I am happy to help convert your simulation to a tensor network for better automatic differentiation if you can provide a minimal working example.
