**TL;DR**

- Vector spaces are first class.
- Tensors are multilinear maps from vector spaces (or their duals) to the underlying field, e.g. t: V\to K in degree 1.
- Dual spaces are first class, with covectors constructed via `α = Covector(V,[1,2,3])`.

- Vectors are degree-1 tensors whose domain is the dual space, i.e. v: V^*\to K.
- Covectors are degree-1 tensors whose domain is the vector space, i.e. \alpha: V\to K.
- Partial evaluation of tensors is supported, resulting in other tensors.
- No special indexing is required.
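As a plain-Julia analogy (a sketch using closures over Base arrays, with made-up helper names — not the package's implementation), a covector is just a linear functional, and a vector can be viewed as a functional on covectors:

```julia
using LinearAlgebra  # for dot

# Hypothetical helpers for illustration only (not TensorAlgebra.jl API):
covector(a) = v -> dot(a, v)    # α: V → K
as_double_dual(v) = α -> α(v)   # v: V^* → K

α = covector([1.0, 2.0, 3.0])
v = [1.0, 2.0, 3.0]

α(v)                  # 14.0
as_double_dual(v)(α)  # 14.0 — the same pairing either way
```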

The second-to-last point is kind of neat. Given a tensor t\in V\otimes W for vector spaces V and W, we get 3 (5 really) maps for free:

- t: V^*\otimes W^*\to K,\quad\alpha\otimes\beta\mapsto t(\alpha\otimes\beta)\in K
- t: W^*\to V,\quad\beta\mapsto t(-,\beta)\in V
- t: V^*\to W,\quad\alpha\mapsto t(\alpha,-)\in W

(identifying each space with its double dual)

```
julia> t(α⊗β)
500.0
julia> t(α,β)
500.0
julia> t(-,β)
3-element Tensor{Float64,1,V^*}:
30.0
70.0
110.0
julia> t(-,β) ∈ V
true
julia> t(α,-)
4-element Tensor{Float64,1,W^*}:
38.0
44.0
50.0
56.0
julia> t(α,-) ∈ W
true
julia> t(α,β) === t(-,β)(α) === t(α,-)(β)
true
```
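In plain Base-array terms (a sketch of the underlying arithmetic, not the package's implementation), these free maps are ordinary contractions: partially evaluating a degree-2 tensor is just matrix–vector multiplication:

```julia
# Same component data as t, α, and β in the example above.
T = [1.0 2 3 4; 5 6 7 8; 9 10 11 12]
a = [1.0, 2, 3]      # components of α
b = [1.0, 2, 3, 4]   # components of β

T * b       # t(-,β): contract the second slot → [30.0, 70.0, 110.0]
T' * a      # t(α,-): contract the first slot  → [38.0, 44.0, 50.0, 56.0]
a' * T * b  # t(α,β): contract both slots      → 500.0
```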

**Longer version**

Hi everyone,

Many of you will remember the epic “Taking vector transposes seriously” (#4774).

That was a heartening discussion that shows the Julia community does take this stuff seriously. As a result, we have a pretty awesome `LinearAlgebra` standard library.

However, as seriously as we took transposes, I still think we can do a little better. There is not a lot of room left in the design space to improve things because `LinearAlgebra` is already very good, but I think there is still some room. In my mind, this was highlighted in the constructive (also epic) discussion that took place with the PR to `LinearAlgebra`:

That discussion resulted in the creation of a new package: TensorCore.jl.

In an attempt to summarize my thoughts, I created the issue:

There were additional constructive discussions there. However, at some point, you just need to shut up and write some code. So that is what I did.

The result is TensorAlgebra.jl.

I am watching

with keen interest. The timeline for something like that would be v2.0 at the earliest, so I think that would be an opportune time to possibly make some improvements to `LinearAlgebra` so that higher-order tensor algorithms (of use in quantum computing, epidemiology, differential geometry, category theory, etc.) can be implemented more naturally.

Here is a quick walkthrough:

```
julia> V = VectorSpace(:V,Float64)
V
julia> v = Vector(V,[1,2,3])
3-element Tensor{Float64,1,V^*}:
1.0
2.0
3.0
julia> v ∈ V
true
julia> α = Covector(V,[1,2,3])
3-element Tensor{Float64,1,V}:
1.0
2.0
3.0
julia> α ∈ V^*
true
julia> α(v)
14.0
julia> v(α)
14.0
julia> W = VectorSpace(:W,Float64)
W
julia> TensorSpace(V,W)
V ⊗ W
julia> V⊗W
V ⊗ W
julia> t = Tensor((V⊗W)^*,[1 2 3 4;5 6 7 8;9 10 11 12])
3×4 Tensor{Float64,2,V^* ⊗ W^*}:
1.0 2.0 3.0 4.0
5.0 6.0 7.0 8.0
9.0 10.0 11.0 12.0
julia> t ∈ V⊗W
true
julia> α
3-element Tensor{Float64,1,V}:
1.0
2.0
3.0
julia> β = Covector(W,[1,2,3,4])
4-element Tensor{Float64,1,W}:
1.0
2.0
3.0
4.0
julia> α⊗β
3×4 TensorProduct{Float64,2,Tuple{Tensor{Float64,1,V},Tensor{Float64,1,W}}}:
1.0 2.0 3.0 4.0
2.0 4.0 6.0 8.0
3.0 6.0 9.0 12.0
julia> α⊗β ∈ (V⊗W)^*
true
julia> t(α⊗β)
500.0
julia> t(α,β)
500.0
julia> t(-,β)
3-element Tensor{Float64,1,V^*}:
30.0
70.0
110.0
julia> t(α,-)
4-element Tensor{Float64,1,W^*}:
38.0
44.0
50.0
56.0
julia> t(-,β) ∈ V
true
julia> t(α,-) ∈ W
true
julia> t(α,β) === t(-,β)(α) === t(α,-)(β)
true
julia> t[2,3]
7.0
julia> (α⊗β)[2,3]
6.0
```
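For comparison, the last few results can be reproduced with Base arrays alone (a sketch; part of the package's appeal is that you don't have to juggle raw indices like this):

```julia
a = [1.0, 2, 3]                        # components of α
b = [1.0, 2, 3, 4]                     # components of β
T = [1.0 2 3 4; 5 6 7 8; 9 10 11 12]  # components of t

outer = a * b'   # components of α ⊗ β (outer product)
outer[2, 3]      # 6.0 — matches (α⊗β)[2,3] = α[2]*β[3]
sum(T .* outer)  # 500.0 — matches t(α⊗β), a full contraction
T[2, 3]          # 7.0 — matches t[2,3]
```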