Weird datatype behaviour

I’m trying to create a 24-bit datatype for a SIC/XE assembly simulator. So far I have this code:

import Core.Intrinsics

# defining a primitive 24-bit integer type for integer registers
primitive type SICInt 24 end

# other signed types -> SICInt
SICInt(x::Int) = Core.Intrinsics.trunc_int(SICInt, x)

# SICInt -> other signed types
Int32(x::SICInt) = Core.Intrinsics.sext_int(Int32, x)
Int64(x::SICInt) = Core.Intrinsics.sext_int(Int64, x)
Int(x::SICInt) = Core.Intrinsics.sext_int(Int, x)

# printing helper
Base.show(io::IO, x::SICInt) = print(io, Int(x))

# Math operations

# machine data structure
mutable struct Machine

    # registers
    F::Float64 # TODO

saved in a file called machine.jl.
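For reference, here is a minimal, self-contained sketch of the 24-bit behaviour those intrinsics give you. SICInt24 and to_int are hypothetical names used so the snippet stands alone and doesn't clash with the definitions in machine.jl:

```julia
import Core.Intrinsics

# a standalone 24-bit primitive type, as in machine.jl
primitive type SICInt24 24 end

SICInt24(x::Int) = Core.Intrinsics.trunc_int(SICInt24, x)
to_int(x::SICInt24) = Core.Intrinsics.sext_int(Int, x)

# values wrap at 24 bits: 2^23 - 1 is the largest representable value
@assert to_int(SICInt24(8388607)) == 8388607    # 2^23 - 1 round-trips
@assert to_int(SICInt24(8388608)) == -8388608   # 2^23 wraps to the minimum
```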
Weird behaviour happens when I try to load the code into the REPL:

As you can see, 2 became of type Core.Int64 instead of Int64.
Strangely, this doesn’t happen if I call typeof(2) before the include statement:

Does anyone have any idea what could be going wrong here or how could I fix this?
Thank you very much.

First things first, Int64 and Core.Int64 are the same type, so nothing has actually changed type:

julia> Int64 === Core.Int64
true

However, you’re defining:

Int64(x::SICInt) = Core.Intrinsics.sext_int(Int64, x)

but you never did import Core: Int64, so your definition actually creates a totally new function that just happens to have the name Int64.

This last part is why the printing of Int64 changes. I don’t totally understand the code that controls it (you can find it here: ), but it seems that Julia abbreviates Core.Int64 to Int64 as long as the name Int64 visible from Main refers to that same type. After your code is included, there’s a brand-new function called Int64 in Main, which has nothing to do with Core.Int64, so Julia has to print the fully qualified Core.Int64 when showing the type name.
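That mechanism can be sketched directly, assuming the default show behaviour for type names (Dummy is a throwaway type introduced just for this illustration):

```julia
# In a fresh session the name Int64 in Main still resolves to Core.Int64,
# so the type prints in its short form:
@assert sprint(show, Core.Int64) == "Int64"

# Defining an unrelated function that happens to be named Int64
# (as machine.jl inadvertently does) shadows the type's name in Main...
struct Dummy end
Int64(x::Dummy) = 0

# ...so Julia now has to qualify the type when printing it:
@assert sprint(show, Core.Int64) == "Core.Int64"
```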

Now there’s a further question of why that behavior depends on the order of operations, which I admit I don’t totally understand.

So, to summarize:

  • Nothing has actually changed type; only the way some types are printed changed, to avoid conflicting with your Int64 and other definitions…
  • … but the real problem is that you presumably meant to write Core.Int64(...) = ... or import Core: Int64; Int64(...) = .... I would recommend the former, as it makes it more obvious to future readers of the code what is going on.
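Concretely, the first fix looks like this (a minimal sketch using only the conversion part of machine.jl):

```julia
import Core.Intrinsics

primitive type SICInt 24 end
SICInt(x::Int) = Core.Intrinsics.trunc_int(SICInt, x)

# Qualifying the name extends Core.Int64's constructor
# instead of creating a brand-new function Main.Int64:
Core.Int64(x::SICInt) = Core.Intrinsics.sext_int(Int64, x)

@assert Int64(SICInt(42)) == 42
@assert Int64 === Core.Int64   # the name still refers to the real type
```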

Thank you, adding the line

import Core: Int64, Int32

solved the problem.