Thanks a lot for your explanation. What I am still struggling to understand is how including sum types in the language would automatically require Base to handle the nothing case of findfirst.
That is what I understood would happen, and I don't quite see how the two are related.
Why couldn't we implement that language feature without touching Base?
I wonder why everyone here seems to see Julia as a dynamic language, while to me it seems designed to be dynamic or statically typed depending on the requirement, situation, and preference.
It seems to me a lost opportunity not to embrace this rare language feature, which is, for whatever reason, not widely available elsewhere. Julia being effectively a gradually typed language is one of its core appeals to me.
And it would also showcase the usability and practicality of sum types to, as some of you have said, complement Julia's approach to the expression problem.
If we added SumTypes to Base, we wouldn’t be able to do much with them within the language without breaking changes. It doesn’t matter how many people use findfirst. All that matters is that, right now, the docs say that it will return a valid index or nothing. So if you change it to return SumType{Int,Nothing} or whatever, that would be a breaking change.
That’s true for basically ALL public API right now. So you couldn’t implement it in Base; it would just be a tool for user land… but if that’s the case, why not keep it as a package, where it can iterate faster and be less concerned with breaking changes, while we (… er… @Mason ) work out how sum types should be implemented.
Probably because its semantics are mostly dynamic, i.e., types are not checked at compile time and dispatch is based on runtime types. Further, types are attached to values, not expressions. It gets a hybrid feel, though, due to its performance optimization of dispatching statically when all types can be inferred at compile time. This is admittedly very clever, as it keeps the language semantics dynamic while running at the speed of compiled languages.
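A minimal sketch of that hybrid behavior (function names here are made up for illustration): dispatch is defined on runtime types, but when argument types are inferable the compiler can resolve the call statically, and Base.return_types lets you inspect what was inferred.

```julia
# Hypothetical example: one generic function, two methods.
double(x::Int) = 2x
double(x::Float64) = 2.0x

# With an abstractly typed container, dispatch happens per element at runtime.
vals = Any[1, 2.5]
results = [double(v) for v in vals]

# In a type-stable context, however, the compiler infers a single concrete
# return type, so the call can be resolved (and inlined) statically.
inferred = Base.return_types(double, (Int,))
```

The semantics are identical in both cases; only how much work is left until runtime differs.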
Imho, sum types are most useful if checked at compile time, which Julia doesn't do. Otherwise, untagged unions are just as fast when constant-propagated, which Julia does. What other use cases would you have in mind where the following would be needed or sufficiently better?
d = Dict(:a => 1, :b => 2)

# Untagged union
dynget(d, k) = get(d, k, missing)

let v = dynget(d, :c)
    if ismissing(v)
        "handle failure here"
    else
        "do something with $v here"
    end
end

@show Base.return_types(dynget, (typeof(d), typeof(:a)))
# Tagged union
abstract type Option{T} end

struct NoVal{T} <: Option{T} # Note that NoVal is still typed!
end

struct Val{T} <: Option{T}
    val::T
end

function safeget(d::Dict{K,V}, k::K) where {K,V}
    if haskey(d, k)
        Val{V}(d[k])
    else
        NoVal{V}()
    end
end

let v = safeget(d, :c)
    if v isa NoVal
        "handle failure"
    elseif v isa Val
        "do something with $(v.val)"
    end
end
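For completeness, the tagged version also infers to a small concrete union, just like the untagged one. A self-contained sketch of that check (Opt/None/Some and maybe_first are made-up stand-ins for the Option types above):

```julia
# Hypothetical minimal Option type, mirroring the tagged-union example.
abstract type Opt{T} end
struct None{T} <: Opt{T} end
struct Some{T} <: Opt{T}
    val::T
end

# Returns Some(first element) or None; both branches are concretely typed.
maybe_first(v::Vector{T}) where {T} = isempty(v) ? None{T}() : Some{T}(first(v))

rts = Base.return_types(maybe_first, (Vector{Int},))
```

So the compiler sees a two-element union either way; the tag only buys you an explicit place to hang exhaustiveness checks.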
JET.@report_call detects inexhaustive @cases in SumTypes.jl:
julia> using SumTypes, JET

julia> @sum_type Color begin
           Red
           Green
       end

julia> f(x) = @cases x begin
           Red => 1
           Green => 2
       end
f (generic function with 1 method)

julia> @report_call f(Red)
No errors detected

julia> g(x) = @cases x begin
           Red => 1
       end
g (generic function with 1 method)

julia> @report_call g(Red)
═════ 1 possible error found ═════
...
││ Inexhaustive @cases specification. Got cases (:Red,), expected (:Red, :Green): SumTypes.assert_exhaustive(::Type{Val{(:Red, :Green)}}, ::Type{Val{(:Red,)}})
│└────────────────────
Nice, thanks. That makes them more useful indeed.
Another use case that just came to my mind would be a sum type for the cases of an Expr. It might make it easier to cover all cases when writing complex macros (which is a bit annoying without something like pattern matching or MacroTools’s @capture anyway).
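A hedged sketch of what that looks like today without sum types: branching on an Expr's head field by hand, where nothing enforces that all heads are covered (describe_node is a made-up helper, not part of any library):

```julia
# Hypothetical helper: manually "matching" on the cases of an Expr.
# With a sum type over heads, a checker could flag the missing cases;
# here the catch-all branch silently absorbs them.
function describe_node(ex)
    ex isa Expr || return "literal or symbol"
    if ex.head === :call
        "call to $(ex.args[1])"
    elseif ex.head === :(=)
        "assignment"
    else
        "other head: $(ex.head)"
    end
end
```

Usage: describe_node(:(f(x))) reports a call, describe_node(:(a = 1)) an assignment, and anything unanticipated falls into the catch-all branch.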
I’m not really seeing how sum types solve the expression problem. Functional languages are good at adding new operations to existing types and bad at applying existing operations to new types. How do sum types help functional languages apply existing operations to new types? The classic solution to the expression problem in Haskell, as I understand it, is type classes, not sum types.
I’d pretty strongly disagree with that. They’re really not relevant to solutions of the expression problem because they’re closed by design. There’s nothing you can do with them that can’t be done with relatively simple if-else statements. The benefit of sum types is in uniformity, ergonomics, and efficiency, not extensibility or modularity. They’re explicitly non-extensible.
True. And type classes ‘open up’ the sum types: the sum types themselves are not expanded, but they are still used underneath.
And as you can see in the F# example, there are also other ways to work around the closed nature of sum types.
Although I still prefer the multiple dispatch approach for its simplicity.
It seems like one of the more elegant ways to solve the expression problem.
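For reference, a minimal sketch of that multiple-dispatch approach (all names hypothetical): new types and new operations can each be added without editing any existing definitions, which is exactly the two-way extensibility the expression problem asks for.

```julia
# Hypothetical expression-problem sketch with multiple dispatch.
abstract type Shape end

struct Circle <: Shape
    r::Float64
end
area(c::Circle) = pi * c.r^2

# Dimension 1: a new type, added later without touching Circle or area.
struct Square <: Shape
    s::Float64
end
area(sq::Square) = sq.s^2

# Dimension 2: a new operation, added later without touching either type.
perimeter(c::Circle) = 2pi * c.r
perimeter(sq::Square) = 4sq.s
```

In practice the "added later" parts would live in separate modules or packages, extending the original function with additional methods.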
Most of the things I’ve seen written about the expression problem in the context of Haskell actually use the closed nature of sum types as an illustration of the problem, not its solution. Type classes are usually suggested as the solution, as has already been mentioned here. Otherwise, I think you need to change sum types to something like OCaml’s polymorphic variants to directly address the expression problem. (Or so I think from what I’ve read. I have used Haskell quite a bit, but I don’t have any real experience with OCaml).