I want to understand how in-place functions are treated when used as in this example:
function g!(u)
    u[:] .= 2*u[:]
end
u=[1]
julia> g!(u) .* (u .+ g!(u))
1-element Array{Int64,1}:
32
To my understanding, this is a completely awkward result. It seems that the functions g! are evaluated first, independent of associative/commutative properties. Am I the only one having problems with this, or is it common knowledge/a general rule?
(I'm just asking because it took me weeks to recognize that such behavior might show up.)
EDIT: For all newcomers digging this up: writing this as
function g!(u)
    u .= 2 .* u
end
is allocation-free and therefore much more performant, as DNF pointed out below.
Using prefix notation (and ignoring broadcasting for now) you have
*(g(u), +(u, g(u)))
so I wonder how you imagine that anything but the gs could be evaluated first — they are needed as inputs for both + and *.
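To make this concrete for the example above, here is a sketch that just names the intermediate values (the temporaries a, b, and c are mine); note that g! returns something that aliases u, so by the time the arithmetic runs all three arguments share the final, twice-doubled data:
u = [1]
a = g!(u)        # u is now [2]; a shares u's memory
b = u            # b is u itself
c = g!(u)        # u is now [4]; a, b, and c all see the same data
a .* (b .+ c)    # 4 * (4 + 4) == 32, the result from the original example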
I don’t know what you mean here. If you are thinking about precedence, that plays no role here as your expression is fully parenthesized.
Generally, it is best to be more careful about mutating functions, or at least signal it with a !. If you must use in-place operations, don’t mix them with other complex expressions. Julia is just doing what you asked for here.
Argument-mutating functions do what they say. Hence it is not possible to use them in these kinds of expressions. Period, I’m afraid. BTW: Your function should also indicate its argument-mutating property by its name, g!.
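In practice, "don't mix mutation with complex expressions" can look like this (a hypothetical rearrangement; here I assume a single application of g! was intended, and what the expression should actually compute is of course up to you):
u = [1]
g!(u)                    # do the mutation on its own line, exactly once
result = u .* (u .+ u)   # no hidden question about ordering or aliasing left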
Actually, it's easy to imagine that everything is evaluated starting from the most deeply nested brackets. Namely, first g(u), then +(u, g(u)), then another g(u), and then the final multiplication. I'm not saying this is better (or worse) than evaluating g(u) twice and only after that computing the + and * operations, just pointing at another option.
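For what it's worth, Julia's actual rule is left-to-right evaluation of the arguments at each call site, which you can check with a quick sketch (the loud helper is mine):
loud(name, x) = (println("evaluating ", name); x)
loud("A", 2) * (loud("B", 3) + loud("C", 4))
# prints A, then B, then C: the outer call's first argument is evaluated
# before anything inside the parentheses, not innermost-first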
I think this is because Julia uses a single + call with multiple arguments here, e.g.:
julia> ex = :(a + b + c)
:(a + b + c)
julia> ex.head
:call
julia> ex.args
4-element Array{Any,1}:
:+
:a
:b
:c
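For the original, fully dotted expression you can also look at the lowered form (a sketch, output omitted): both g!(u) calls are evaluated as ordinary arguments first, and only then does the single fused Base.materialize(Base.broadcasted(...)) call perform the arithmetic.
Meta.@lower g!(u) .* (u .+ g!(u))   # inspect the lowering in the REPL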
As I said above, I think it is bad style to depend on these things. IMO, having to think about precedence and evaluation order in mutating code is a code smell.
This is an aside to the main point of this discussion, but this implementation defeats the point of in-place mutation, which is normally to avoid allocations. The slicing causes allocations. Observe:
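The original measurement isn't reproduced here, but a minimal sketch of this kind of check might look like the following (the names g_slice! and g_fused! are mine; exact byte counts will vary):
function g_slice!(u)
    u[:] .= 2*u[:]        # u[:] on the right copies u, and 2*copy allocates again
end

function g_fused!(u)
    u .= 2 .* u           # fused broadcast writing straight into u, no temporaries
end

u = rand(1000)
g_slice!(u); g_fused!(u)  # warm up so compilation is not measured
@allocated g_slice!(u)    # nonzero: temporary arrays from the slices
@allocated g_fused!(u)    # should report zero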
I completely agree, but let me explain why I brought this up.
When I want to solve a DAE with DifferentialEquations.jl, I can do this either in the form of an ODEProblem using a mass_matrix = M, which solves M*du = f!(du,u).
Or, as I thought, I can translate it to a DAEProblem via 0 = f!(du,u) .- M*du. This failed since I was not aware of this kind of interference. So I necessarily need to allocate at least once more to rewrite my problem.
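For reference, a sketch of that kind of rewrite (all names here are hypothetical: rhs! is an in-place right-hand side, M a constant mass matrix, tmp the one extra preallocated buffer; I believe the in-place DAEProblem residual signature is f(resid, du, u, p, t)):
using LinearAlgebra: mul!

# Toy stand-ins, just for illustration:
M    = [1.0 0.0; 0.0 0.0]                  # singular mass matrix (the DAE case)
rhs!(out, u, p, t) = (out .= -2.0 .* u)    # hypothetical in-place right-hand side
tmp  = zeros(2)                            # the one extra preallocated buffer for M*du

# Residual 0 = f(u) - M*du, written without mixing the mutating call
# into a larger expression (in real code, carry M and tmp in p):
function dae_residual!(resid, du, u, p, t)
    mul!(tmp, M, du)                       # tmp = M*du, in place
    rhs!(resid, u, p, t)                   # resid = f(u), in place
    resid .-= tmp                          # resid = f(u) - M*du
    return nothing
end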
But in summary, I have no questions left. Thanks a lot for your patience!
Yeah, next time I will spend more time on providing a 100% acceptable, performant, and still minimal example (but true, the one I gave was one of the worst possible).
It wasn’t intended as a criticism—the example was fine for illustrating the question—but as information to help you improve your code. I assumed that you weren’t aware of this performance trap.