Continuous provision of current logs

Hello all,

I want to continuously provide the logs that a long-running process (an optimization problem) produces on a cloud instance. To do this, I tried running the function that drives these continuous updates asynchronously, and also on a separate thread. During each update I upload the log file to a GCP bucket so that it can be accessed from another resource.

Here is a code example:

using Logging

io = open("/home/logs.txt", "w+")
logger = SimpleLogger(io)
global_logger(logger)

function run_logger!(io::IOStream)
    counter = 0
    while isopen(io)
        # flush the io stream to the file
        @info("Updated $counter times on thread $(Threads.threadid())")
        flush(io)
        ###
        # uploading to gcp
        ###
        # only update every 5 seconds
        sleep(5)
        counter += 1
    end
    @info("Parallel Logger closed.")
    nothing
end

# async test
@async run_logger!(io)

# multithreaded test
Threads.@spawn run_logger!(io)

# async and multithreaded test
Threads.@spawn @async run_logger!(io)

If I now run the instance, the log is updated twice at the beginning, in parallel. After that, it is only updated again once all processes have finished, via another function. The while loop is also not exited during this time. Can anyone tell me what the problem is, or how I can manage to provide the logs continuously?
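For context, here is a minimal, self-contained sketch of what I suspect might be happening (this is an assumption on my part: that compute-bound work on the main thread starves the cooperatively scheduled logger task; the names `probe` and `busy` are just for illustration):

```julia
# Julia tasks are cooperatively scheduled: an `@async` task on the main
# thread only runs when the main task hits a yield point. A compute-bound
# loop with no yield points therefore starves the task.

probe = Ref(0)

# This task should tick three times, yielding via `sleep` each iteration.
t = @async for _ in 1:3
    probe[] += 1
    sleep(0.05)
end

# Simulate a non-yielding optimization: pure arithmetic, no I/O, no `yield()`.
function busy(seconds)
    x = 0.0
    start = time()
    while time() - start < seconds
        x += sin(x)  # never yields to the scheduler
    end
    return x
end

busy(0.5)
starved = probe[]  # still 0: the task never got a chance to run
wait(t)            # `wait` yields, so the task can now run to completion
done = probe[]     # 3
```

If the optimization behaves like `busy` here, the logger task would only run once the main computation finishes, which matches what I observe.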

I don’t really understand what you are trying to exemplify with the @spawn and @async, but perhaps all of that is moot if you simply use a FileLogger, which automatically flushes every message.

julia> Threads.nthreads()
4

julia> using Logging, LoggingExtras

julia> logger = FileLogger("logs.txt");

julia> global_logger(logger);

julia> @sync for i in 1:4
           Threads.@spawn @info "Hello from thread $(Threads.threadid())"
       end

shell> cat logs.txt
┌ Info: Hello from thread 1
└ @ Main REPL[5]:2
┌ Info: Hello from thread 3
└ @ Main REPL[5]:2
┌ Info: Hello from thread 4
└ @ Main REPL[5]:2
┌ Info: Hello from thread 2
└ @ Main REPL[5]:2

Thanks for the reply @fredrikekre .
Sadly, this does not solve my issue. What I didn’t mention in my original post is that I’m also uploading this file so that it can be accessed from another resource. This currently happens right after the ‘flush(io)’ statement.

I have updated my initial post to make this clearer.

Okay, I don’t really understand what you are trying to do then. Can you come up with a self-contained example which demonstrates the issue? E.g. instead of uploading the log file, you could copy it to another location on disk or something like that.

Here is an example with copying the file.

using Logging

io = open("/home/logs.txt", "w+")
logger = SimpleLogger(io)
global_logger(logger)

function copy_log()
    f = open("/home/online_log.txt", "w")
    write(f, read("/home/logs.txt", String))
    close(f)
end

function run_logger!(io::IOStream)
    counter = 0
    while isopen(io)
        # flush the io stream to the file
        @info("Updated $counter times on thread $(Threads.threadid())")
        flush(io)
        ###
        copy_log()
        ###
        # only update every 5 seconds
        sleep(5)
        counter += 1
    end
    @info("Parallel Logger closed.")
    nothing
end

# async test
@async run_logger!(io)

# multithreaded test
Threads.@spawn run_logger!(io)

# async and multithreaded test
Threads.@spawn @async run_logger!(io)
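As an aside, one thing that could also go wrong with `copy_log` as written is that a reader sees a half-written copy. A possible variant (my own sketch, not tested against the original setup; the `copy_log_atomic` name and the paths are placeholders) writes to a temporary file and then renames it into place:

```julia
# Sketch: copy via a temporary file, then rename. On the same filesystem the
# rename is atomic, so a concurrent reader never observes a partial copy.
function copy_log_atomic(src::AbstractString, dst::AbstractString)
    tmp = dst * ".tmp"
    open(tmp, "w") do f
        write(f, read(src))   # copy the current contents of the source log
    end
    mv(tmp, dst; force=true)  # atomically replace the destination
end
```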

Since the while loop is not exited on the instance, I assume it might be a similar issue to this one: