I have a simple example function here that works just fine, i.e. it returns what I expect when I call it with a 2D argument `x` and a 1D argument `p`:
```julia
@. function dummy(x, p)
    3.0*x[:, 2] + p[1] + x[:, 1]*p[2]
end
```
If instead I write (for example, because I want to use the result of this calculation for further calculations within the same function)
```julia
@. function dummy(x, p)
    w = 3.0*x[:, 2] + p[1] + x[:, 1]*p[2]
    return w
end
```
then I get an `UndefVarError: w not defined`. So far, so good.
However, I have tried every way I can think of to define `w` (as a local array of the appropriate shape, i.e. an array whose length equals the first dimension of `x`), but the error message always stays the same. Even if I define `w` as an obviously wrong type, the error is still "not defined" rather than some form of "nonsense defined".

How do I define a local `w` correctly in Julia for use in an `@.` function?
You don’t need to apply `@.` to the entire function (and I wouldn’t recommend doing so, for exactly the reason you’ve run into here). The problem is that `@.` turns your `=` into `.=`, which transforms `w = ...` into `w .= ...`, and that tries to broadcast the result into the (non-existent) array `w`.
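To make the diagnosis concrete: `w .= ...` only works if `w` already exists as an array of the right shape, so one workaround (separate from the fix below) is to preallocate `w` before broadcasting into it. A minimal sketch with illustrative sample values:

```julia
x = [1.0 2.0; 3.0 4.0]   # 2×2 sample matrix
p = [10.0, 0.5]          # sample parameter vector

# Preallocate w with the length of the first dimension of x,
# then `@.` turns `=` into `.=` and broadcasts into the existing w:
w = similar(x[:, 1])
@. w = 3.0*x[:, 2] + p[1] + x[:, 1]*p[2]
w  # == [16.5, 23.5]
```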
Instead, just do:
```julia
function dummy(x, p)
    w = @.(3.0*x[:, 2] + p[1] + x[:, 1]*p[2])
    return w
end
```
Also, please quote your code so that it will render correctly.