Optional variable declarations

There’s a discussion on Reddit that has some interesting ideas. It prompted me to ask whether it might be possible to add the option of declaring a variable before use. This might give the following advantages:

  • More correctness checking
var x::Int
const var y::Int # May be assigned to only once
var z::Vector{Real}

x = "test" # Error
x = 1
x = y # Error
y = 2
y = 3 # Error
z = 1.0 # Error
z = [1] # Error
  • Plausibly, it will assist optimization, because the compiler has the guarantee that the type will not change during execution

  • It enables method dispatch on a function’s return type (a sketch of how current Julia expresses this choice follows the example below)

var x::Vector
var y::(Vector,Matrix)

x = eig(A) # Computes only eigenvalues
y = eig(A) # Computes both eigenvalues and eigenvectors
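
For comparison, here is a minimal sketch of how this choice is expressed in current Julia, where it is encoded in the function name (eigvals vs. eigen) rather than in a declared return type:

using LinearAlgebra

A = [2.0 1.0; 1.0 2.0]

vals = eigvals(A)       # eigenvalues only
vals2, vecs = eigen(A)  # Eigen factorization; destructures into values and vectors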

I would be interested to hear from the language experts whether this might be possible.

Thanks

This I really don’t like because it results in unreadable code.

But I am not an expert; this is just something that sprang to mind. Looking forward to this discussion, it promises to be interesting…

How about

(var y::(Vector,Matrix)) = svd(A)

Or even

y = eig(A)::(Vector,Matrix)

That one actually doesn’t even need variable declarations.
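
As a side note, in today’s Julia the tuple type in that position would be spelled Tuple{Vector,Matrix}, and as a plain type assertion (without any new dispatch) the pattern already works for functions that actually return such a tuple. A minimal sketch:

using LinearAlgebra

A = [2.0 1.0; 1.0 2.0]

# Type assertion on the result; it checks the type, it does not select a method
y = (eigvals(A), eigvecs(A))::Tuple{Vector,Matrix}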

Actually, this I like as an optional thing (I didn’t mention this above).
Your suggestions for making the code more readable are all acceptable, and as always, unreadable code is always possible, I guess. Let’s see what the real experts have to say.

You can already use local x::Int in a function; does this cover your needs?
const is currently not supported, but I’m pretty sure there is a GitHub issue about that.


No, it doesn’t


julia> function foo()
           local x::Int
           x = 1.0
       end
foo (generic function with 1 method)

julia> foo()
1.0

It should give an error instead.

No, this works: x is an Int inside the function, but you are returning the RHS of the assignment, i.e. the literal 1.0.
In x = 1.0, this internally calls convert(Int, 1.0) and assigns the result to x.
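
A quick way to see both effects at the REPL (a minimal sketch; bar is just an illustrative name):

julia> function bar()
           local x::Int
           y = (x = 1.0)        # the assignment expression evaluates to the RHS
           (x, typeof(x), y)    # but x itself holds the converted Int
       end
bar (generic function with 1 method)

julia> bar()
(1, Int64, 1.0)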


Hmm, I guess you’re right

julia> function foo()
         local x::Vector{Int}
         x = 1.0
         print(typeof(x))
         x
       end
foo (generic function with 1 method)

julia> foo()
ERROR: MethodError: Cannot `convert` an object of type Float64 to an object of type Array{Int32,1}

The documentation could be a bit more explicit about this, I think.

Now what about the other point: support for method dispatch on a function’s return type?

foo()::Int = 1
foo()::Float64 = 2.0

local x::Int
local y::Float64

x = foo() # x = 1
y = foo() # y = 2.0

Or maybe,

x = foo()::Int 
y = foo()::Float64 

In other words, convert(Int, foo()) would actually select the appropriate method.
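
For reference, in current Julia the ::T syntax on a call result is a type assertion applied to the value returned by the one foo that exists; it never selects a method. A minimal sketch:

foo() = 2.0

y = foo()::Float64    # passes: the result really is a Float64
# x = foo()::Int      # would throw a TypeError; it does not call an Int-returning method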

I don’t think this can be made to work in Julia (or if it can, it would require a lot of work, but I could be wrong).

In the meantime, I suggest that you ask questions to solve concrete problems you encounter. Designing a language is hard, and discussing these design questions for Julia requires at least a good familiarity with how the language works now.

The main reason I asked about the variable declaration is to support dynamic dispatch on the return type, so it is not a usage question.

Why can’t it be made to work in Julia, or why would it require a lot of work? The syntax examples I gave would be translated into a function call where the return types are passed to the function as a hidden parameter. If only one method applies, this hidden parameter is ignored, so existing libraries would not need to be rewritten.
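
For illustration only, here is a minimal sketch of what that hidden parameter could look like if it were written out explicitly in today’s Julia (myeig is a hypothetical name, not a proposal for the actual mechanism):

using LinearAlgebra

# The requested "return type" is passed as an ordinary first argument:
myeig(::Type{Vector}, A) = eigvals(A)                              # eigenvalues only
myeig(::Type{Tuple{Vector,Matrix}}, A) = (eigvals(A), eigvecs(A))  # both

A = [2.0 1.0; 1.0 2.0]
vals = myeig(Vector, A)
vals, vecs = myeig(Tuple{Vector,Matrix}, A)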

My point was that this might not really be needed in practice, and there might be easy workarounds for specific cases where you feel you need it.

I will let others answer on the feasibility of return-type dispatch. I only know this kind of feature from compiled languages like Haskell (and maybe Rust?), and I just have a feeling it’s difficult and won’t happen in Julia, so I won’t try to think more about it.


It’s already common practice for functions to not have fixed types of return values, for example, iterators: “The return value from iterate is always either a tuple of a value and a state, or nothing if no elements remain.”

That is not a problem for dispatch on return types, as far as I can tell.

I am not sure I understand the problem you are trying to solve here, but if you are looking for “arrow types”, search for the term in

It is my understanding that they are explicitly unsupported as a design choice.


I am not sure I understand the problem you are trying to solve here

An example I gave above is the problem that MATLAB solves with the “nargout” function.
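
For what it’s worth, a common Julia workaround for that pattern is an explicit flag (a minimal sketch; eig_with_flag and the vectors keyword are hypothetical names, and the flag stands in for MATLAB’s implicit output count):

using LinearAlgebra

function eig_with_flag(A; vectors::Bool = false)
    vals = eigvals(A)
    return vectors ? (vals, eigvecs(A)) : vals
end

A = [2.0 1.0; 1.0 2.0]
vals = eig_with_flag(A)                        # cheap: eigenvalues only
vals, vecs = eig_with_flag(A; vectors = true)  # compute both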

First, such a change would be breaking, so discussions like this are purely theoretical at this stage in Julia’s lifecycle. That does not rule them out; it’s just good to keep in mind.

That said, the major difficulty I see with your proposal is that, in general, there is no reliable way to tell whether something is a “single value” or “multiple values”. E.g. for

f(args...)::T

T may be a tuple type, but treated as a single value. Or it may be an otherwise opaque type that is nevertheless iterable with iterate. With Julia’s rich type system, such distinctions are not always clear, and deciding would require the compiler to guess programmer intent.
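
To make the ambiguity concrete, here is a minimal sketch (point and EigenPair are hypothetical illustrations): the declared return type alone does not say whether the result is meant as one value or as several values to destructure.

# A tuple type, but intended as a single value (a 2D point):
point()::Tuple{Float64,Float64} = (1.0, 2.0)

# An opaque struct, yet destructurable because it defines iterate:
struct EigenPair
    values::Vector{Float64}
    vectors::Matrix{Float64}
end
Base.iterate(e::EigenPair, s = 1) =
    s == 1 ? (e.values, 2) : s == 2 ? (e.vectors, 3) : nothing

p = point()                                  # used as one value
vals, vecs = EigenPair([1.0], ones(1, 1))    # destructured into two values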


I don’t think that it would break anything. Perhaps you could give an example?

That argument also applies to a function’s arguments, so it proves too much. (In fact, the implementation would pass the output types as a hidden argument.)

Since functions currently do not dispatch on the return type, anything resembling nargout in MATLAB would be breaking.

I am not sure I understand what you are saying here. Dispatch has well-specified semantics in Julia, and works on types. Whether those types are, semantically, for multiple or single values does not concern dispatch per se. It is just the types. E.g.

f(::Foo) = 1

may be called regardless of whether I define the type as

struct Foo end # singleton

or

struct Foo{T} <: AbstractVector{T} # rather like a FillArray
    v::T
    n::Int
end
Base.size(f::Foo) = (f.n,)
Base.getindex(f::Foo, i::Int) = f.v # no bounds check

It would work the same way for dispatch on the output types. I think you overlooked the fact that I am not advocating a nargout keyword per se, but a way to implement equivalent functionality.

Not at all. The existing method dispatch does not change. But if I define a more specific method, then method dispatch calls the most specific method, just like it does now with respect to the input arguments.

function eigvals(args)::NTuple{2, Array}
    values, vectors = eigen(args)
    return (values, vectors)
end

So in the existing code,

F = eigvals(A)

does the same thing as now, but

local F::NTuple{2, Array}
F = eigvals(A)

also computes the eigenvectors.