I was trying to overload the += operator, but got this error instead:

ERROR: LoadError: syntax: invalid identifier name "+="
Stacktrace:
[1] include at .\boot.jl:317 [inlined]
[2] include_relative(::Module, ::String) at .\loading.jl:1044
[3] include(::Module, ::String) at .\sysimg.jl:29
[4] exec_options(::Base.JLOptions) at .\client.jl:231
[5] _start() at .\client.jl:425
in expression starting at C:\Users\User\Desktop\Tfach\julia_project\test.jl:36

(For defining in-place operations, .+= is much more powerful because it fuses with other "dot" operations, and .+= can be customized via the broadcast machinery.)

I don't think .+= is suitable because here (and eventually in many other cases) the type is not a container (although it might be seen as one); the required operation is on the type itself, not on its elements.
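For context: += itself cannot be overloaded because a += b is just syntax for a = a + b, so extending + gives += behavior for free. A minimal sketch, where Poly is a hypothetical stand-in for the Dict-backed Polynomial type discussed below:

```julia
# Hypothetical minimal stand-in for the Dict-backed Polynomial type.
struct Poly
    data::Dict{Int,Float64}
end

# Extend `+`; then `p += q` rebinds p to the new `p + q`.
function Base.:+(p::Poly, q::Poly)
    d = copy(p.data)
    for (k, v) in q.data
        d[k] = get(d, k, 0.0) + v      # merge coefficients power by power
        iszero(d[k]) && delete!(d, k)  # keep the "no zero coefficients" invariant
    end
    Poly(d)
end

p = Poly(Dict(1 => 2.0))
p += Poly(Dict(1 => 3.0, 2 => 1.0))    # syntax for p = p + Poly(...)
```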
Just a quick benchmark shows a speed and space gain between the two versions (note: I renamed += to iadd):
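A hedged sketch of that kind of comparison, using plain Dicts as stand-ins for the polynomial's coefficient storage (iadd! here is a hypothetical in-place variant that mutates its first argument):

```julia
add(p, q)   = mergewith(+, p, q)    # out-of-place: allocates a fresh Dict each call
iadd!(p, q) = mergewith!(+, p, q)   # in-place: reuses p's storage

p = Dict(i => rand() for i in 0:15)
q = Dict(i => rand() for i in 0:15)

add(p, q); iadd!(copy(p), q)        # warm up compilation before measuring
@show @allocated add(p, q)          # allocates a new Dict
@show @allocated iadd!(p, q)        # little to no allocation
```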

Yes, but the Polynomial takes an arbitrary Real.
Changing the type to be more specific:

mutable struct Polynomial{T<:Real}
    data::Dict{Int, T}
    function Polynomial(data::Dict{Int, <:Real})
        @assert all(n -> n >= 0, keys(data))
        @assert all(!iszero, values(data))
        new(data)
    end
    Polynomial() = new(Dict())
end

produces this error:

ERROR: LoadError: syntax: too few type parameters specified in "new{...}"
Stacktrace:
[1] include at .\boot.jl:317 [inlined]
[2] include_relative(::Module, ::String) at .\loading.jl:1044
[3] include(::Module, ::String) at .\sysimg.jl:29
[4] exec_options(::Base.JLOptions) at .\client.jl:231
[5] _start() at .\client.jl:425
in expression starting at C:\Users\User\Desktop\Tfach\julia_project\test.jl:5

Excuse me for asking, but isnât implementing polynomials using Dicts incredibly inefficient? It also seems to lead to very messy code. Why not use tuples or StaticArrays to hold the coefficients? Or maybe just plain vectors would be the easiest.

I don't know if it's "incredibly" inefficient, but it's just for academic purposes: I have a Python exercise to implement polynomials using dicts and I wanted to reimplement it as-is in Julia.

Because your type is parametric and new needs that information:

mutable struct Polynomial{T <: Real}
    data::Dict{Int, T}
    function Polynomial(data::Dict{Int, W}) where {W <: Real}
        @assert all(n -> n >= 0, keys(data))
        @assert all(!iszero, values(data))
        new{W}(data)
    end
    Polynomial{T}() where {T <: Real} = new{T}(Dict{Int, T}())
    # and for a default of `Real`
    # Polynomial() = Polynomial{Real}()
end

EDIT: Note that the type constraint T <: Real in the constructors is redundant: it's already ensured by the type signature itself and technically doesn't need to be checked in the constructor again.
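A quick check of how these constructors behave (the struct is repeated here so the snippet is self-contained):

```julia
mutable struct Polynomial{T <: Real}
    data::Dict{Int, T}
    function Polynomial(data::Dict{Int, W}) where {W <: Real}
        @assert all(n -> n >= 0, keys(data))
        @assert all(!iszero, values(data))
        new{W}(data)
    end
    Polynomial{T}() where {T <: Real} = new{T}(Dict{Int, T}())
end

p = Polynomial(Dict(0 => 1, 3 => 2))   # parameter inferred from the Dict: Polynomial{Int}
z = Polynomial{Float64}()              # empty polynomial with an explicit parameter
```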

OK, that's fine. It just seems so much more obvious to use vectors (or tuples), which have integer indices by default, than to use dicts. I guess it can be useful if you have very widely separated terms, like 2x^3 + 11x^{74} or something.

Python loves dicts, but in Julia they are not necessarily the obvious goto solution.

In that case, would a SparseVector be good as well? This would need some testing, but I'm feeling lucky today in saying it would be easier to use with existing multiplication etc.
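An untested sketch of that idea, mapping index i to power i-1 so the widely separated example above fits in a SparseVector from the SparseArrays stdlib:

```julia
using SparseArrays

# 2x^3 + 11x^74  →  nonzeros at powers 3 and 74, i.e. indices 4 and 75
p = sparsevec([4, 75], [2, 11], 75)
q = sparsevec([1, 4], [5, 1], 75)    # 5 + x^3

s = p + q                            # existing vector addition just works
```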

The indices of the vector or tuple would be equivalent to the power at that position in the example by @DNF - for example, v = [3 0 0 2 0] would uniquely describe the polynomial 3x^4 + 2x.
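A small sketch of that descending-power convention, with evaluation done by Horner's rule (evalpoly_desc is a hypothetical helper, not an existing function):

```julia
# v[i] is the coefficient of x^(length(v) - i), so [3, 0, 0, 2, 0] ↦ 3x^4 + 2x.
# Horner's rule: fold left, multiplying the accumulator by x at each step.
evalpoly_desc(v, x) = foldl((acc, c) -> acc * x + c, v; init = zero(eltype(v)))

evalpoly_desc([3, 0, 0, 2, 0], 2)   # 3*2^4 + 2*2 = 52
```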

A quick and dirty test using vectors gave me a 60x speedup for 10th-15th degree polynomials, both for construction and addition. If you have only small polynomials, you can probably get incredible speedups with tuples/StaticVectors, though at the cost of more messing about with type parameters and maybe generated functions.