A Qubit DSL?


One thing I’m particularly interested in is calculating the entanglement entropy for various lattice models. This gives some low-hanging fruit to start out with, recreating results from the literature, but lets one branch out pretty fast.

Having originally come from Loop Quantum Gravity, I feel your pain. Though to be fair, it’s not actually obvious to me that there isn’t some transformation group which neural networks are covariant under, so it may be that these arrays actually are tensorial with respect to some information-theoretic transformation. But that sounds like a non-trivial statement to explore.


Yep, it doesn’t parse:

Meta.parse("[x]⟩") # ParseError("invalid character \"⟩\"")

IMHO QuDirac strings are better than any proposal I’ve read here. You may dislike wrapping things in strings, but that is already doable, and this discussion is getting more complex by the day. If you don’t like inputting the strings, you could add a REPL mode to QuDirac that takes strings, parses them as Dirac strings, and returns the result. This way, instead of:

julia> bell = d" 1/√2 * (| 0,0 > + | 1,1 >) "
Ket{KroneckerDelta,2,Float64} with 2 state(s):
  0.7071067811865475 | 0,0 ⟩
  0.7071067811865475 | 1,1 ⟩

It could be made to look kinda like this:

dirac> bell = 1/√2 * (| 0,0 > + | 1,1 >) 
Ket{KroneckerDelta,2,Float64} with 2 state(s):
  0.7071067811865475 | 0,0 ⟩
  0.7071067811865475 | 1,1 ⟩
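Along those lines, the string-macro route can be illustrated with a tiny sketch. The macro name `k_str` and its label handling here are invented for illustration and are not QuDirac’s actual implementation; the point is just that `| 0,1 >` can live inside a string even though it is not valid Julia syntax:

```julia
# Hypothetical minimal ket string macro (not QuDirac's real one):
# strips the | ... > delimiters and returns the comma-separated basis labels.
macro k_str(s)
    labels = String.(split(strip(s, ['|', '>', ' ']), ','))
    return labels  # a real implementation would build a Ket from these
end

k"| 0,1 >"  # → ["0", "1"]
```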


That does sound like a good starting point. Maybe today, if I have time, I’ll dig back through the relevant literature and see if I can get a handle on what’s involved.

Well, that may be. But the fact that, in practice, people just take whatever data they can find (which can be just about anything in any form) and mangle it so they can stuff it into the input of a neural network seems like a pretty strong sign that there aren’t really any deep group-theoretic ideas at play. Certainly there could be in principle, and you could be really silly and say “Well, I can just take this column and apply SO(N) to it, so I’m just going to say it’s SO(N)”, but at best that completely violates the “spirit” of the word tensor. Besides, it turns out that the fundamental objects are the layers, which are to be considered some sort of “auxiliary” random variable (individual weights are meaningless).

Sorry, I’m causing this thread to get badly off-topic.


In case anyone’s still thinking about this, an alternative to string macros or modifying Julia’s parser is a REPL mode.

using ReplMaker

const ketpat = r"\|.*?\>"
ketrep(str) = "Ket("*(match(r"(?<=\|).*?(?=>)", str).match)*")"

const brapat = r"\<.*?\|"
brarep(str) = "Bra("*(match(r"(?<=<).*?(?=\|)", str).match)*")"

function rep_braket(str)
    replace(replace(str, brapat => brarep), ketpat => ketrep)
end

parse_braket(str) = Meta.parse(rep_braket(str))

julia> initrepl(parse_braket,
                prompt_text="BraKet> ",
                prompt_color = :red,
                start_key = '>',
                mode_name = "BraKet-Mode")
REPL mode BraKet-Mode initialized. Press > to enter and backspace to exit.
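Before wiring this into the REPL, the rewriting step itself can be checked in isolation (restating the definitions from the snippet so this runs standalone):

```julia
# The regex pass turns |x> into Ket(x) and <x| into Bra(x);
# bras are rewritten first, then kets.
const ketpat = r"\|.*?\>"
ketrep(str) = "Ket(" * match(r"(?<=\|).*?(?=>)", str).match * ")"
const brapat = r"\<.*?\|"
brarep(str) = "Bra(" * match(r"(?<=<).*?(?=\|)", str).match * ")"
rep_braket(str) = replace(replace(str, brapat => brarep), ketpat => ketrep)

rep_braket("σz = |↑>*<↑| - |↓>*<↓|")
# → "σz = Ket(↑)*Bra(↑) - Ket(↓)*Bra(↓)"
```

The resulting string is ordinary Julia, so `Meta.parse` handles it from there.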

Now if you define what Bra and Ket mean, i.e.

Ket(a::Vector) = a
Bra(a::Vector) = a'

you can have all sorts of fun

↑ = [1, 0]
↓ = [0, 1]

BraKet> σz = |↑>*<↑| - |↓>*<↓|;
BraKet> σx = |↑>*<↓| + |↓>*<↑|;
BraKet> σy = -im*|↑>*<↓| + im*|↓>*<↑|;

BraKet> σx*|↑>
2-element Array{Int64,1}:
 0
 1

BraKet> σx*|↑> == |↓>
true

BraKet> σz*|↓> == -|↓>
true
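For anyone who doesn’t want to set up the REPL mode, the same toy algebra can be checked with the kets and bras written out as plain column and row vectors (this is just ordinary linear algebra, independent of the mode above):

```julia
# Plain-Julia version of the BraKet> session: |↑>, |↓> as columns,
# <↑|, <↓| as their adjoints.
up   = [1, 0]
down = [0, 1]

σz = up*up' - down*down'             # |↑><↑| - |↓><↓|
σx = up*down' + down*up'             # |↑><↓| + |↓><↑|
σy = -im*(up*down') + im*(down*up')  # -im|↑><↓| + im|↓><↑|

σx*up == down          # true
σz*down == -down       # true
σx^2 == [1 0; 0 1]     # Pauli matrices square to the identity: true
```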


FYI, we are planning to compile the block tree in Yao.jl#124, maybe you would be interested in this kind of DSL for quantum computation. We can definitely consider something like <| and |> as strings.

And it might be more elegant to have this in a special REPL mode as well (for the quantum registers).


Hi, I’m actually making progress on the arbitrary-dimension side of things for fermion algebras.

There are still some more optimizations I need to make to the type system, but it is already looking good. This Grassmann package will be able to do calculations with any sort of spin groups, Lie groups, fermions, mixed tensors, spacetime algebras \Lambda(V^n), etc., and I am using StaticArrays for it.

It handles up to 2^{9+2}=2048 dimensions right now, but it can go higher too.


I had meant to respond to this way back when you first posted it but forgot.

This looks very interesting! I look forward to taking a closer look at it. Indeed, one of the most challenging things about doing lattice gauge theory (at least in a somewhat general way) is that it tends to involve tensors with many indices of different types. For example, even the Wilson operator U_{\mu} contains not only the spacetime vector index and a pair of SU(N) indices, but in the discrete form of the problem even all spacetime dimensions are essentially additional indices (at least from a computational perspective). Grassmann.jl looks like a good starting point for tackling something like that.
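For concreteness, that index bookkeeping can be made explicit with a plain-array layout (an illustration I’m adding here, not anything from Grassmann.jl): one N×N color matrix per lattice site and direction.

```julia
# Illustration only: SU(2) link variables on a 4^4 lattice stored as a plain
# array, one 2×2 complex matrix per direction μ and site.
# Index layout: (color, color, μ, x, y, z, t).
L, N = 4, 2
U = zeros(ComplexF64, N, N, 4, L, L, L, L)
for t in 1:L, z in 1:L, y in 1:L, x in 1:L, μ in 1:4
    U[:, :, μ, x, y, z, t] = [1 0; 0 1]   # "cold start": every link = identity
end
size(U)  # (2, 2, 4, 4, 4, 4, 4)
```

Even in this trivial form, five of the seven array dimensions are "extra indices" beyond the SU(N) pair, which is the point being made above.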


Indeed, SU(n) is a Lie group whose algebra can be represented with Grassmann’s exterior product algebra, generating a 2^{2n}-dimensional mother algebra with the geometric product from the n-dimensional vector space and its dual vector space. The products of the vector basis and covector basis elements form the n^2-dimensional bivector subspace of the full \frac{(2n)!}{2(2n-2)!}-dimensional bivector sub-algebra, which is also a topological space with lots of group structure.
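The dimension counts above are easy to sanity-check numerically (a worked check using Base’s `binomial`; note \frac{(2n)!}{2(2n-2)!} is just \binom{2n}{2}):

```julia
# Mother algebra: 2^(2n); mixed (1,1) bivector subspace: n^2;
# full bivector sub-algebra: C(2n, 2) = (2n)!/(2*(2n-2)!).
mother_dim(n)   = 2^(2n)
mixed_dim(n)    = n^2
bivector_dim(n) = binomial(2n, 2)

(mother_dim(3), mixed_dim(3), bivector_dim(3))  # → (64, 9, 15)
```

So for n = 3 the n² = 9 mixed bivectors sit inside the 15-dimensional bivector sub-algebra of a 64-dimensional mother algebra.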

The package Grassmann is working towards making the full extent of this number system available in Julia by using static compiled parametric type information to handle sparse sub-algebras, such as the (1,1)-tensor bivector algebra needed for Lie groups (also conformal groups and projective geometry).

It can already handle 10,000,000 geometric computations per second on a single thread, but is still missing some core features and isn’t ready for official release yet. In my next update, I am specifically adding support for (1,1)-tensors from the tensor product.


The README has been updated: you can now use direct sum notation to construct the space

julia> using Grassmann; @mixedbasis ℝ⊕ℝ
(++--*, e, e₁, e₂, f¹, f², e₁₂, e₁f¹, e₁f², e₂f¹, e₂f², f¹², e₁₂f¹, ...)

julia> ℝ'⊕ℝ^3 # Minkowski spacetime

With that, you can now also construct a mixed (1,1)-tensor product grade 2 element as

julia> ℒ = ((e1+2*e2)∧(3*f1+4*f2))(2)
0e₁₂ + 3e₁f¹ + 4e₁f² + 6e₂f¹ + 8e₂f² + 0f¹²

which can also be used as a function accepting Grade{1} vectors as arguments

julia> ℒ(e1+e2)
7e₁ + 14e₂ + 0f¹ + 0f²

julia> [3 4; 6 8] * [1,1]
2-element Array{Int64,1}:
  7
 14

which is equivalent to the corresponding matrix computation.
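The equivalence can be spelled out: the coefficients of ℒ are exactly the outer product of the two coefficient vectors, so applying ℒ to e1+e2 matches a matrix–vector product (plain linear algebra, independent of Grassmann.jl):

```julia
# Coefficients of (e1 + 2e2)∧(3f1 + 4f2) as an outer-product matrix:
A = [1, 2] * [3 4]     # → [3 4; 6 8]
A * [1, 1]             # → [7, 14], i.e. 7e₁ + 14e₂
```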

Unfortunately, I don’t have a fully general code-transformation rule for generating the evaluation function for arbitrary tensor products with any number of indices; I am still working on generalizing and optimizing that.

This has been a really fun package to make so far. It is a bit abstract, but I would like to hear feedback.


By the way, @ExpandingMan the Grassmann.jl package now supports very high dimensions (basically arbitrarily high) and it now has the generalized inner products available also.


That’s looking more and more interesting; looking forward to digging into it!