+= operator

Is the += operator, used as in:

x = 5.0
x += 3.0

equivalent to

x .= x + 3.0

or to

x = x + 3.0

According to the documentation, it is equivalent to the latter (Mathematical Operations and Elementary Functions · The Julia Language), at least in Julia 1.8.5


It’s the latter, as the documentation describes.
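A quick check at the REPL confirms the rebinding behavior:

```julia
x = 5.0
x += 3.0     # rewritten by the parser as x = x + 3.0 (a rebinding, not a mutation)
println(x)   # 8.0
```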


That is very surprising, because in languages such as C++, += overwrites the variable in place. What is the rationale for this choice of operator? Why have such an operator? Thanks.


See Updating Operators in the manual

and Vectorized Dot Operators


Different languages make different choices - in Julia, += (that is, assignment) is not an operator. Assignments don’t operate on anything; they are expressions. It’s just a difference in language design.
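You can inspect this rewriting yourself: `Meta.lower` (a standard Base function) shows that the parser turns `x += 3` into an ordinary assignment. A minimal sketch:

```julia
ex = Meta.parse("x += 3")       # the surface expression :(x += 3)
lowered = Meta.lower(Main, ex)  # lowering rewrites it as x = x + 3
println(lowered)                # the lowered code computes x + 3 and assigns it to x
```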


There is also

x .+= 3

This is only for mutable containers like arrays, since it is equivalent to a broadcast! call. It will give an error if x refers to a scalar (which is immutable), as in the examples above.
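A minimal sketch of the difference (the alias `b` is only there to make the mutation visible):

```julia
a = [1.0, 2.0]
b = a           # b is another name for the same array
a .+= 3.0       # in-place: roughly broadcast!(+, a, a, 3.0)
println(b)      # [4.0, 5.0], since the shared array was mutated

x = 5.0
# x .+= 3.0     # errors: a Float64 is immutable and cannot be written into
```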

I’m confused about what behavior you expect here.

x = 5.0
x += 3.0

does exactly the same thing in Julia as it does in C++ for a double variable x. (In both cases, the value of x is 8.0 afterwards.)


Yes, but is the address (memory location) of x in the first line, x = 5.0, the same as the memory address of x in the second line, x += 3.0? I believe that in C++ it is; the += operator works in place. From the earlier messages in this thread, Julia does not update a scalar x in place.

The language standards say nothing about the memory addresses of local variables (unless you specifically compute &x in C++, or if it is volatile), AFAIK. But in both C++ and Julia the compiler almost certainly updates x in-place with the new value. In fact, it probably puts x in a register — it may have no “memory address”!

(Conversely, the “memory location” of a local variable can change over time, e.g. if there is a register spill, and variables can even be optimized completely out of the program. The only time a value should stay in one spot is if you explicitly store it at a memory location, e.g. by putting it in an array or a mutable struct, or if you request a pointer to it in C/C++. And in that case of course you can update that location in-place, e.g. array[i] += 5.0 updates in-place in Julia.)
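For example, a small sketch using `pointer` (the standard way to peek at an Array's buffer address) shows the element update happening at a fixed location:

```julia
a = zeros(3)
p = pointer(a)   # address of the array's underlying data buffer
a[1] += 5.0      # read-modify-write at that fixed location
println(pointer(a) == p)   # true: same buffer before and after
println(a[1])              # 5.0
```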


For things like scalars, does it really matter if it is the same memory address or not? In numerical computing, I can’t think of any examples why it would.

I think @erlebach is worried more about efficiency? But that is confusing language semantics with compiler optimizations. Even though semantically x = 5.0 and x += 3.0 refer to “different objects” (since you aren’t mutating the value of 5.0!) in Julia, and in some sense in C++, in practice the compiler will almost certainly store the new value in the same location (e.g. a register, on the stack, whatever).
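One way to see the semantic point, that += rebinds the name rather than mutating the value:

```julia
x = 5.0
y = x        # y is bound to the same value, 5.0
x += 3.0     # rebinds x to a new value; the value 5.0 itself is untouched
println(y)   # still 5.0
```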

(Fun fact: in ancient times, with Fortran, you could sometimes accidentally “change the value of 5.0”.)

In Julia, you can quite easily inspect the compiler output for yourself. For example:

julia> function f(y)
           x = y
           x += 3.0
           return x
       end

julia> @code_native debuginfo=:none f(5.0)   # (output edited to remove some metadata)
	fmov	d1, #3.00000000
	fadd	d0, d0, d1

In the compiled version of this function, there are no memory addresses! The input is passed in a register d0, to which 3.0 is added “in-place” with an fadd instruction and returned.