Ccall printf works with ints but not with floats. Why?

julia> ccall(:printf, Cint, (Ptr{Cchar}, Cint), "aiai %d", 10);
aiai 10
julia> ccall(:printf, Cint, (Ptr{Cchar}, Cdouble), "aiai %f", 10.99);
aiai 0.000000

The context for this is to reduce invalidations that cause latency.

using SnoopCompile
precompile(plot, (Array{Float64,2},));

tinf = @snoopi tmin=0.01 plot([0.0 0; 1.1 1])
5-element Vector{Tuple{Float64, Core.MethodInstance}}:
 (0.015999794006347656, MethodInstance for GMT.check_caller(::Dict{Symbol, Any}, ::String, ::String, ::String, ::String, ::Nothing, ::Bool))
 (0.03399991989135742, MethodInstance for hvcat(::Tuple{Int64, Int64}, ::Float64, ::Vararg{Number, N} where N))
 (0.0409998893737793, MethodInstance for Printf.format(::Vector{UInt8}, ::Int64, ::Printf.Format{Base.CodeUnits{UInt8, String}, NTuple{4, Printf.Spec{Val{'g'}}}}, ::Float64, ::Vararg{Float64, N} where N))
 (0.07500004768371582, MethodInstance for GMT.add_opt(::Dict{Symbol, Any}, ::String, ::Char, ::Matrix{Symbol}, ::Symbol, ::Vector{Union{Nothing, Matrix{Float64}}}, ::NamedTuple{(:outline, :fill), Tuple{String, String}}))
 (0.24000000953674316, MethodInstance for Printf.format(::Printf.Format{Base.CodeUnits{UInt8, String}, NTuple{4, Printf.Spec{Val{'g'}}}}, ::Float64, ::Float64, ::Float64, ::Vararg{Float64, N} where N))

So Printf is responsible for some non-negligible invalidation. The above happens in a line with

opt_R = @sprintf(" -R%.12g/%.12g/%.12g/%.12g", info[1].data[1], info[1].data[2],
                 info[1].data[3], info[1].data[4])

I checked this: if I replace the line above with a literal opt_R = " -R1/2/3/4", the recompilation goes away. So my idea was to try to call the libc sprintf directly, but I can’t make it work for floats, only for ints.
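For reference, a sketch of this kind of call with the varargs part declared explicitly (the trailing Cdouble...) and writing into a byte buffer rather than into a Julia String; the snprintf symbol is an assumption here (on Windows the older CRT exports it as _snprintf):

# Sketch: format four doubles through libc's snprintf into a writable buffer
# (assumed large enough for this format). The trailing `Cdouble...` declares
# the varargs, so the variadic calling convention is used.
buf = Vector{UInt8}(undef, 128)
n = ccall(:snprintf, Cint, (Ptr{UInt8}, Csize_t, Cstring, Cdouble...),
          buf, sizeof(buf), " -R%.12g/%.12g/%.12g/%.12g", 0.0, 1.1, 0.0, 1.0)
opt_R = String(buf[1:n])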

Hmm, this seems to be a Windows issue. If I use @ccall it doesn’t work either

julia> str = "                   ";

julia> @ccall sprintf(str::Cstring, "%s = %d"::Cstring ; "foo"::Cstring, 6::Cint)::Cint;

julia> str
"foo = 6\0           "

julia> str = "                   ";

julia> @ccall sprintf(str::Cstring, "%s = %f"::Cstring ; "foo"::Cstring, 6.6666::Cdouble)::Cint;

julia> str
"foo = f\0           "

but in WSL it works:

julia> str = "                   ";

julia> @ccall sprintf(str::Cstring, "%s = %f"::Cstring ; "foo"::Cstring, 6.6666::Cdouble)::Cint;

julia> str
"foo = 6.666600\0    "

printf is a varargs function, and calling varargs C functions is tricky.
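For reference, plain ccall can also declare the varargs part by putting ... on the last element of the argument-type tuple; a minimal sketch for the float case above:

# The trailing `Cdouble...` tells ccall that everything after the format string
# is passed as varargs, so the variadic calling convention is used.
ccall(:printf, Cint, (Cstring, Cdouble...), "aiai %f\n", 10.99)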

That is why I tried the @ccall macro, which supports varargs.

I am not really sure about printf, but for sprintf-style formatting I do the following (via asprintf) on Windows 10 (Julia 1.5):

strp = Ref{Ptr{Cchar}}(C_NULL)   # receives the pointer to the buffer asprintf allocates
len = ccall(:asprintf, Cint, (Ptr{Ptr{Cchar}}, Cstring, Cdouble...), strp, fmt, z)
str = unsafe_string(strp[], len) # copy the C string into a Julia String
Libc.free(strp[])                # the buffer was malloc'd on the C side, so free it

where fmt is a traditional C format string, z is the number I want to print, and str is the output.
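Wrapped into a helper it could look like this; a sketch with a hypothetical name, and note that asprintf is a GNU/BSD extension, so whether the symbol resolves depends on the C runtime:

# Hypothetical helper wrapping the snippet above.
function c_sprintf(fmt::AbstractString, z::Real)
    strp = Ref{Ptr{Cchar}}(C_NULL)
    len = ccall(:asprintf, Cint, (Ptr{Ptr{Cchar}}, Cstring, Cdouble...),
                strp, fmt, Cdouble(z))
    len < 0 && error("asprintf failed")
    str = unsafe_string(strp[], len)
    Libc.free(strp[])
    return str
end

c_sprintf(" -R%.12g", 1.1)   # " -R1.1"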

Maybe of some help.

Those aren’t invalidations; they are triggers of type inference. @snoopr is what measures invalidations:
https://timholy.github.io/SnoopCompile.jl/stable/snoopr/
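Roughly, following those docs (a sketch):

using SnoopCompile
# record the method invalidations triggered while the package loads
invalidations = @snoopr using GMT
# group them into trees for inspection
trees = invalidation_trees(invalidations)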

You can eliminate most of that time by adding precompiles in your package for whatever function has that @sprintf, but those will work only if the call to Printf.format is inferrable. Check @code_typed and see whether it’s an invoke (inferrable) or a call (non-inferrable).
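A sketch of both suggestions; build_opt_R is a hypothetical stand-in for whatever GMT function contains that @sprintf:

# Hypothetical: precompile the helper that contains the @sprintf for the
# concrete argument types it is called with.
precompile(build_opt_R, (Float64, Float64, Float64, Float64))

# Inspect the typed code of the caller: a statically resolved call shows up as
# an `invoke` of Printf.format, while a non-inferred one stays a dynamic call.
@code_typed plot([0.0 0; 1.1 1])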


Thanks, but I don’t know how to do that. Those @sprintf calls live in a common function that is called by the GMT plot module as well as several other modules. That function does a first pass over the input data, finds the BoundingBox, and creates a (string) option with that info. The precompile

precompile(plot, (Array{Float64,2},));

should exercise that function, and in fact it seems to do so on Julia 1.5, where this latency (or whatever I should call it) does not exist. But it does exist on Julia 1.6, where the whole GMT package gets slower to start.

               _
   _       _ _(_)_     |  Documentation: https://docs.julialang.org
  (_)     | (_) (_)    |
   _ _   _| |_  __ _   |  Type "?" for help, "]?" for Pkg help.
  | | | | | | |/ _` |  |
  | | |_| | | | (_| |  |  Version 1.7.0-DEV.54 (2020-12-13)
 _/ |\__'_|_|_|\__'_|  |  Commit 6ddc7f1ecc (15 days old master)
|__/                   |

julia> @time using GMT
[ Info: Precompiling GMT [5752ebe1-31b9-557e-87aa-f909b540aa54]
 18.023743 seconds (2.24 M allocations: 137.498 MiB, 0.22% gc time, 5.17% compilation time)

julia> @time plot([0.0 0; 1.1 1])
  4.668330 seconds (3.76 M allocations: 176.236 MiB, 2.92% gc time, 99.39% compilation time)

and in 1.5:

               _
   _       _ _(_)_     |  Documentation: https://docs.julialang.org
  (_)     | (_) (_)    |
   _ _   _| |_  __ _   |  Type "?" for help, "]?" for Pkg help.
  | | | | | | |/ _` |  |
  | | |_| | | | (_| |  |  Version 1.5.3 (2020-11-09)
 _/ |\__'_|_|_|\__'_|  |  Official https://julialang.org/ release
|__/                   |

julia> @time using GMT
[ Info: Precompiling GMT [5752ebe1-31b9-557e-87aa-f909b540aa54]
 17.095942 seconds (1.77 M allocations: 99.291 MiB, 0.23% gc time)

julia> @time plot([0.0 0; 1.1 1])
  3.799247 seconds (1.04 M allocations: 54.201 MiB, 0.30% gc time)

Thanks, I’ll certainly explore it if the apparent 1.6 regression in @sprintf gets in the way.
