Opening and closing file overhead



Hello everyone,

I'm somewhat new to programming and I have a question: which of these two approaches is faster? Which one do you recommend?

function main()
    open a file
    for loop
        <doing some stuff>
        write something to the file
    end loop
    close the file
end of function main


function printer(what should be written)
    open the file
    write to the file
    close the file
end of function printer

function main()
    for loop
        <doing some stuff>
        printer(what should be written)
    end loop
end of function main


Optimization is mostly about doing as little as possible, so you should avoid repeatedly opening and closing files, unless the resources consumed by keeping one file open for the whole duration of <doing some stuff> are somehow unacceptable (usually not the case).
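To get some intuition for the cost, here is a rough sketch comparing the two approaches (the file names and iteration count are made up for illustration). On a typical system the open-per-write version pays the open/close overhead on every iteration and is noticeably slower:

```julia
# Approach 1: open the file once, write inside the loop, close once.
function write_open_once(n)
    open("once.txt", "w") do io
        for i in 1:n
            write(io, "line $i\n")
        end
    end
end

# Approach 2: reopen (in append mode) and close on every iteration.
function write_open_each(n)
    rm("each.txt", force=true)       # start from a fresh file
    for i in 1:n
        open("each.txt", "a") do io
            write(io, "line $i\n")
        end
    end
end

# Warm up, then time both; expect the second to be slower.
write_open_once(1000); write_open_each(1000)
@time write_open_once(1000)
@time write_open_each(1000)
```

Both produce the same 1000 lines; the only difference is how often the file is opened and closed.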


I found this answer convincing, but I have a follow-up question. When I keep the file open during the loop, I want to see what is being written to the file, so I use the

tail -f FILENAME

command on Linux, but I see nothing. Does that mean the write() function does nothing until it reaches the close-file call, or am I missing something?


The output is likely being buffered (that’s true for most I/O libraries in different languages).
You’d need to flush the output before it can be seen by other processes.
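For example, in Julia a write to a regular file may sit in the stream's own buffer until it is flushed; `flush(io)` hands the buffered bytes to the operating system so another process (such as `tail -f`) can see them. A minimal sketch (the file name is made up):

```julia
io = open("log.txt", "w")
write(io, "first line\n")   # may only land in the in-process buffer
flush(io)                   # after this, `tail -f log.txt` shows the line
close(io)                   # close() also flushes anything still buffered
```

Note that `close` flushes too, which is why the output finally appears when the file is closed at the end of the loop.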


Yes, I am aware of that. I am using flush(STDOUT); right now, but the problem still exists. Maybe I'm flushing the wrong buffer?


Yes, you need to flush the buffer you're writing to, like so:

open("foo.txt", "w") do io
    write(io, "hi")
    flush(io)   # make the buffered data visible outside this process
end


Thanks, I will do that.
How can I make it write at the end of the opened file instead of overwriting the file?


open("foo.txt", "a") do io   # "a" opens the file in append mode
    write(io, "more text\n")
end


Thanks. :slight_smile: