I want to make a type that internally stores values in a `Complex` form. I assumed that I should simply convert any `Real` input. However, some math constants give an error when doing this:

```
julia> Complex(2.0f0)
2.0f0 + 0.0f0im
julia> Complex(pi/2)
1.5707963267948966 + 0.0im
julia> Complex(pi)
ERROR: MethodError: no method matching Irrational{:π}(::Int64)
...
Stacktrace:
[1] convert(::Type{Irrational{:π}}, ::Int64) at ./number.jl:7
[2] oftype(::Irrational{:π}, ::Int64) at ./essentials.jl:334
[3] zero(::Irrational{:π}) at ./number.jl:262
[4] Complex(::Irrational{:π}) at ./complex.jl:16
[5] top-level scope at none:0
```

The same happens for `ℯ`. I only caught it because of a fairly arbitrary unit test. I’m not saying this behavior is irrational (har har!), but it is arguably not intuitive, so I would love to know whether it’s intentional.

For the `Real` types I’ve tried, adding `0im` always works. Not a big deal, but it feels a bit non-Julian to use addition for what is fundamentally a type conversion. I’m open to equally simple yet general alternatives.
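For reference, one workaround that reads more like a conversion (just a sketch, not necessarily the idiomatic answer) is to force the irrational into a concrete floating-point type before wrapping it, since `zero` is well defined for `Float64` even though it is not for `Irrational{:π}`:

```
julia> Complex(float(pi))
3.141592653589793 + 0.0im

julia> convert(Complex{Float64}, pi)
3.141592653589793 + 0.0im
```

The `convert` form also fits naturally inside a constructor that already knows its target element type.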