Slices: should they default to views?

I wonder which is more common: slicing a matrix to create a copy of the entries, or to create a view? If we want the latter, we have to use @views. But if the latter is the more common use, wouldn’t it be nice to dispense with the need for an explicit macro?

Also, it seems that the expectation among newcomers to Julia is the latter. A case in point: coming from Fortran, I expect slicing to carry no copy penalty (in storage or speed).

The consequences of this incorrect assumption are often corrected in this forum in order to regain performance. It might be good to match these expectations rather than kick off a re-education process, wouldn’t you agree?
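To make the distinction concrete, a minimal sketch (the array and variable names are just for illustration):

```julia
A = rand(1000, 1000)

col = A[:, 1]         # slicing allocates a fresh Vector (a copy)
vcol = @view A[:, 1]  # a SubArray aliasing A's memory, no copy

col[1] = 0.0          # leaves A untouched
vcol[1] = 0.0         # writes through to A[1, 1]

# @views rewrites every slice in an expression into a view:
s = @views sum(A[:, 1]) + sum(A[:, 2])  # no temporary copies
```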


This has been discussed so many times already… please let’s not rehash old topics :weary: In particular, I think this comment is most relevant. See also: Array slices as views - what's the state of affairs?

The only thing that has changed since then is that views are now stack-allocated most of the time. On the other hand, we now have A LOT of code that assumes slicing copies. So let’s hold off on this discussion until 2.0, when it’s even possible to change this (though I have my doubts that it would change, either way).

Yeah. The problem is that in the 0.6/0.7 era, the compiler wasn’t as good, so the tradeoff wasn’t as clear.

The topic is older than that - at the very least, 0.4 from what I could find. 0.6/0.7 was just the last time this was considered in any capacity, because that was the last breaking release before 1.0. No matter what, this change cannot happen before 2.0, which is nowhere near on the horizon from what I can tell.

I thought an argument for not providing views when slicing was safety: you can accidentally overwrite your data if you aren’t aware/careful. Having to use view/@view is a way to opt into the unsafe mechanism, much like @inbounds is opt-in. But maybe I completely made up this argument?
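For anyone following along, a small sketch of the hazard being described:

```julia
x = collect(1:5)

y = x[2:4]        # copy: later writes to y cannot corrupt x
y .= 0            # x is still [1, 2, 3, 4, 5]

z = @view x[2:4]  # view: z aliases x
z .= 0            # x is now [1, 0, 0, 0, 5] -- silently overwritten
```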


You may be right. Fortran has mechanisms for enforcing the semantics assumed in the callee: intent(in), intent(out), intent(inout).


This is good.

Sorry about the rehash. But it is not straightforward to find out about the history of something, especially going back several years.


Yeah, as a side note I must say that’s a thing I like about Fortran. In Julia we use the convention of ! to signal mutation, but it isn’t clear what is being mutated: it’s usually the first argument, but not always; in some cases it’s the second, or the first two, or something else, or none at all (and sometimes ! is omitted even though some arguments do get mutated). The intent declarations in Fortran make this explicit, but I don’t think we can translate that directly to Julia, since not all arguments are mutable in principle.
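A quick illustration of that ambiguity, using only Base functions:

```julia
v = [3, 1, 2]
sort!(v)                # mutates its only argument

dest = zeros(Int, 3)
copyto!(dest, v)        # mutates the first argument, reads the second

map!(x -> 2x, dest, v)  # mutates the second argument (dest), not the first
```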


It’s alright, it happens from time to time. I myself only searched for “view as slice” here on this very forum, found the Discourse thread I linked, and through that found the issue where this was discussed.

Julia doesn’t do that confusing NumPy thing where indexing sometimes makes a view and sometimes a copy (a copy when the indices are irregular), so I think making x[i] syntax produce views by default in v2 would be fine, especially for consistency with left-side (set)indexing, which in a sense already writes to a view. Maybe getindex could then be merged with copy and deepcopy.
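To spell out that asymmetry:

```julia
A = zeros(3, 3)

A[1, :] = [1, 2, 3]  # left side: setindex! writes into A in place
r = A[1, :]          # right side: getindex allocates a copy...
r .= 0               # ...so this does not touch A
```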

On the other hand, I wonder how much we’re taking things for granted. Maybe a switch would just turn complaints about copy-as-slice ruining performance (via repeated or large allocations) into complaints about view-as-slice ruining performance (via cache-unfriendly access patterns and defeated SIMD). How could we know which complaints would be worse?

I think you are right; copying data to coalesce memory accesses and to enable SIMD ops may well be the right thing to do sometimes.
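One rough way to probe the tradeoff is a sketch like the following (which side wins depends on the machine, the sizes, and whether the slice is reused):

```julia
using BenchmarkTools

A = rand(10_000, 1_000)
rows = 1:2:10_000  # strided row selection

f_view(A, rows) = sum(view(A, rows, :))  # no allocation, but strided reads
f_copy(A, rows) = sum(A[rows, :])        # pays an allocation, then reads contiguously

@btime f_view($A, $rows)
@btime f_copy($A, $rows)
```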


People have also discussed that making broadcasting work nicely over slices (already possible via arcane syntax like getindex.(Ref(A), inds) instead of A[inds]) could alleviate the concern about the difference in many places. I don’t remember where that discussion took place, however, and I don’t think it’s close to being realized.
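For reference, the arcane spelling in action; because it participates in broadcasting, it fuses with surrounding dotted operations, so the slice never materializes on its own:

```julia
A = collect(10:10:100)
inds = [1, 3, 5]

A[inds]                  # [10, 30, 50] -- ordinary slicing, allocates
getindex.(Ref(A), inds)  # same values, via broadcasting over the indices

B = 2 .* getindex.(Ref(A), inds) .+ 1  # fuses: no separate slice temporary
```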
