Workspace doesn't display UInt values

I’ve just started using Julia and I wanted something nice to start working in, so I’ve downloaded and installed JuliaPro. All good. But I’ve just noticed something that’s annoying me lots and I don’t know if it can be fixed.

I use the Workspace quite a bit as I’m writing code and it’s particularly useful as I try to transfer my python code across. However, it seems that unsigned ints cannot be displayed in the Workspace (see image) - is this correct? Is there a setting that needs to be changed? It’s super annoying to have to use println(fileversion) everywhere!

(screenshot: Workspace pane showing UInt variables)

The workspace uses the normal Julia show methods for displaying UInts:

julia> UInt32(2621970)
0x00280212

Is there a way to change this so that it shows the actual Int value?

That is the actual value.

You can override the relevant show method with

julia> Base.show(io::IO, n::Unsigned) = print(io, n)

julia> UInt32(123)
123

but that’s type piracy and thus not recommended.
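A piracy-free alternative is to convert on demand whenever you want the decimal view, rather than changing how every `Unsigned` prints (a quick sketch, using only Base functions):

```julia
x = UInt32(2621970)

# Convert to the default signed integer type:
Int(x)       # 2621970

# Or keep the same width, just reinterpret as signed:
signed(x)    # Int32(2621970), printed in base 10
```

This leaves the global `show` behavior untouched, so other packages that rely on the hex representation are unaffected.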

Ok, thanks for the solution.

But I would argue that the actual value is the ‘Int’, whereas what is displayed is the hexadecimal representation of that ‘Int’. But I guess this is just how Julia does things.

The values are the same, just differently represented.
Personally, I don’t necessarily like that Unsigneds are printed in base-16, but Juno at least doesn’t do anything special here. We could show the base-10 representation in a tooltip or something, but Juno isn’t actively developed anymore, so that’s unlikely to happen :wink:

That’s exactly what I should have said. For cursory checks, base-10 would be so much easier for simple brains like mine. And it would remove lots of print statements that will inevitably get forgotten/lost.

Anything similar to Juno that is actively developed?

Yes: https://www.julia-vscode.org/

I’m not sure… Typically, Int is used for variables that represent a size or count, where base 10 is more natural indeed, and UInt is often used for values where individual bits or bytes have a meaning.

There are good reasons for this… See a recent discussion about why indices and length use Int.

Maybe it’s different in your case, but I see two masks in your list. Would they be more readable in base 10? (I guess not.)

For full_timestep maybe it would make sense to use an Int?

For fileversion it depends how the version is encoded… For example, 0x00280212 could encode a version of the form “x.y” as two 16-bit values: 0x0028 and 0x0212. That would arguably be less readable in base 10.
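To illustrate (this encoding is purely a guess, not documented behavior of the user’s file format): if the version really were packed as two 16-bit halves, bit operations recover the parts directly, and the hex display makes the split obvious in a way the decimal 2621970 does not:

```julia
v = 0x00280212

major = v >> 16        # 0x0028, i.e. 40 in base 10
minor = v & 0xffff     # 0x0212, i.e. 530 in base 10
```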


You are correct about the masks, I’m not really interested in reading those at all, but there are many other variables that are also UInt that I want to see quickly but weren’t in the attached screenshot.

The historical context of Int vs UInt is interesting, but as these values are all attributes in HDF5 files that I’m reading from some 3rd party software, I just need to be able to read and check these attributes quickly.


Then you should probably just do something like fileversion_signed = Int(fileversion) and check this variable in the workspace instead.
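For example (with made-up attribute names and values, standing in for whatever is read from the HDF5 file), converting once at read time keeps everything downstream in base 10:

```julia
# Hypothetical raw attribute values as read from the file:
fileversion_raw   = UInt32(0x00280212)
full_timestep_raw = UInt32(250)

# Store signed copies in a struct, so the Workspace shows them in base 10:
struct FileAttrs
    fileversion::Int
    full_timestep::Int
end

attrs = FileAttrs(Int(fileversion_raw), Int(full_timestep_raw))
```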

Yes, I’m now converting to an Int when I read the file and saving it into a struct, but when exploring and prototyping I would have preferred to just view it in the Workspace. It’s just a minor annoyance at this stage.