Flushing Logging output immediately, without buffering

I’ve got a problem similar to this question: How to flush stdout. I want to log output from a long-running script, usually to stderr, and check on it with tail -f.
If I run the script and redirect its output to a file, then whether I print with println or log with @info from the Logging package, all the output is buffered until the script finishes and only written at the end.

I saw that this question, Issues with println buffering output when redirecting stdout to a txt file, had some ideas about making a FlushingIO type, which works for the println case.

Is there a similar way to extend Logging so that @info always flushes right away? Or, alternatively, how can I make sure my logging output doesn’t wait until the script finishes?

You can test the buffering with this small script:

$ cat testprint.jl
using Logging

# IO wrapper (from the question linked above) that flushes after every println
struct FlushingIO{T}
    io::T
end

function Base.println(io::FlushingIO, args...)
    println(io.io, args...)
    flush(io.io)
end

out = FlushingIO(stderr)

println(stderr, "a")
sleep(1)
println(out, "b")
sleep(1)
@info "c"
sleep(1)
@warn "d"
sleep(1)

Run it and watch the log with:

$ julia testprint.jl &> test.log & tail -f test.log
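To make the “extend Logging” idea concrete, this is roughly the kind of wrapper I have in mind on the logging side (just an untested sketch; FlushingLogger is a made-up name, not something that exists):

using Logging

# Hypothetical FlushingLogger: wraps another logger and flushes the
# given IO stream after every log record.
struct FlushingLogger{L<:AbstractLogger, T<:IO} <: AbstractLogger
    logger::L
    io::T
end

# Forward the standard logger interface to the wrapped logger...
Logging.min_enabled_level(l::FlushingLogger) = Logging.min_enabled_level(l.logger)
Logging.shouldlog(l::FlushingLogger, args...) = Logging.shouldlog(l.logger, args...)
Logging.catch_exceptions(l::FlushingLogger) = Logging.catch_exceptions(l.logger)

# ...and flush after each message has been handled.
function Logging.handle_message(l::FlushingLogger, args...; kwargs...)
    Logging.handle_message(l.logger, args...; kwargs...)
    flush(l.io)
end

global_logger(FlushingLogger(ConsoleLogger(stderr), stderr))

Is something like this already available, or is there a better approach?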


Sorry to bump this, but I’ve got the same problem using Base logging (@info etc.). It seems to buffer many lines of stdout before writing them to the stream. I’d rather avoid having to put flush(stdout) after every log call. Any better ideas?


I have the same issue. Any suggestions?


I have the same problem. Is there a way to control the buffering, or is it required to execute

flush(io)

after each log event? That doesn’t feel right :thinking:

Just in case someone stumbles over this issue in the future: the solution is the FileLogger and/or FormatLogger from the package GitHub - JuliaLogging/LoggingExtras.jl: Composable Loggers for the Julia Logging StdLib.

These loggers flush the buffer to the IO stream after each log message.
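For example, something along these lines (a minimal sketch; the formatting function is just an illustration, and if I remember correctly both loggers have always_flush=true by default):

using Logging, LoggingExtras

# FormatLogger writes each record with the given formatting function and
# then flushes the stream, so lines show up immediately in tail -f.
logger = FormatLogger(stderr) do io, args
    println(io, "[", args.level, "] ", args.message)
end
global_logger(logger)

@info "this line is written to the redirected stream right away"

# Or log directly to a file; FileLogger also flushes after each message:
# global_logger(FileLogger("test.log"))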
