I have an open mind here, but I recently had a screw-up that made me wonder whether the current behavior of `size` is ideal. My mistake was calling `size(v, 2)` where `v` is a `Vector`, when I should have been calling `size(v, 1)` or `length(v)`. It took me a while to catch the error, partly because I would have expected `size(v::Vector, 2)` to throw an error.
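To make the behavior concrete, here is a minimal sketch (with an illustrative vector `v` of my own; the values don't matter):

```julia
v = [1.0, 2.0, 3.0]   # a Vector, i.e. a rank-1 array

size(v)      # (3,)  — the shape tuple has exactly one entry
size(v, 1)   # 3
length(v)    # 3
size(v, 2)   # 1     — dimensions past ndims(v) are reported as 1, no error
```

So any trailing dimension silently reads as `1`, which is exactly what masked my bug.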

I tend to think of rank-N tensors as having exactly N indices, which is a little different from having infinitely many indices where every index past the Nth ranges over only one value. The distinction matters mainly because it is possible to have tensors with only one component that do *not* transform as a singlet under some groups (e.g. rescaling); admittedly this is rather abstract, and of course arrays often possess none of the qualities of a tensor.

The current behavior is reasonable, so I have no strong opinions, but I thought I'd throw this out there and see what everyone else's opinion is. Note that at least in numpy there is no equivalent, since you can't do `np.shape(A, 1)`.
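For comparison, a short NumPy sketch of what I mean (again with an illustrative array of my own): `np.shape` takes no dimension argument, and indexing past the end of the shape tuple raises rather than returning 1.

```python
import numpy as np

v = np.array([1.0, 2.0, 3.0])  # 1-D array

print(np.shape(v))   # (3,) — np.shape(A) returns the shape tuple, no axis arg
print(v.shape[0])    # 3

# Asking for a nonexistent second dimension fails loudly:
try:
    v.shape[1]
except IndexError:
    print("IndexError: shape tuple has only one entry")
```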