Force flushing of output to a file while bash script is still running
I found a solution to this. Using the OP's example, you basically run
stdbuf -oL /homedir/MyScript &> some_log.log
and then the buffer gets flushed after each line of output. I often combine this with nohup
to run long jobs on a remote machine.
stdbuf -oL nohup /homedir/MyScript &> some_log.log
This way your process doesn't get killed when you log out.
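A runnable sketch of the combined form. Since /homedir/MyScript is the OP's script and not available here, a small throw-away script stands in for it:

```shell
#!/usr/bin/env bash
# Stand-in for /homedir/MyScript: prints a few lines with pauses,
# so buffering behavior is observable.
cat > /tmp/MyScript <<'EOF'
#!/usr/bin/env bash
for i in 1 2 3; do
  echo "step $i"
  sleep 0.1
done
EOF
chmod +x /tmp/MyScript

# stdbuf -oL makes stdout line-buffered, so each line hits the log
# as soon as it is printed; nohup keeps the job alive after logout;
# &> sends both stdout and stderr to the log.
stdbuf -oL nohup /tmp/MyScript &> /tmp/some_log.log
```

While the real job runs in the background, `tail -f /tmp/some_log.log` from another session shows the lines appearing one at a time instead of all at once at the end.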
You can use tee to write to the file without the need for flushing.
/homedir/MyScript 2>&1 | tee some_log.log > /dev/null
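A self-contained version of the same pipeline, with an inline loop standing in for /homedir/MyScript:

```shell
#!/usr/bin/env bash
# tee writes whatever it reads straight through to the file, so lines
# land in the log as they arrive; > /dev/null discards the copy that
# tee would otherwise echo to the terminal.
{ for i in 1 2 3; do echo "tick $i"; done; } 2>&1 \
  | tee /tmp/tee_log.log > /dev/null
```

Note the caveat: the script's stdout is now a pipe rather than a terminal, so programs that block-buffer when not attached to a tty may still delay their writes before tee ever sees them.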
bash itself will never actually write any output to your log file. Instead, each command the script invokes writes its own output and flushes whenever it sees fit. So the real question is how to force the commands within the bash script to flush, and that depends on what they are.
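To illustrate that flushing is a per-program setting, here are two common knobs (this assumes python3 and awk are installed; neither is mentioned in the original answers):

```shell
#!/usr/bin/env bash
# Python block-buffers stdout when it is redirected to a file;
# -u (or PYTHONUNBUFFERED=1) disables that buffering.
python3 -u -c 'print("ready")' > /tmp/py_out.log

# awk buffers too; fflush() pushes the pending output immediately.
awk 'BEGIN { print "ready"; fflush() }' > /tmp/awk_out.log
```

Other interpreters have their own equivalents (e.g. autoflush settings), which is why there is no single bash-level switch.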
script -c <PROGRAM> -f OUTPUT.txt
The key is -f. From man script:
-f, --flush
Flush output after each write. This is nice for telecooperation: one person
does 'mkfifo foo; script -f foo', and another can supervise real-time what is
being done using 'cat foo'.
Run in background:
nohup script -c <PROGRAM> -f OUTPUT.txt
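A minimal self-contained run, assuming util-linux script and substituting a placeholder echo for <PROGRAM>:

```shell
#!/usr/bin/env bash
# script allocates a pseudo-terminal for the command, so even programs
# that block-buffer when redirected behave as if writing to a tty, and
# -f flushes the transcript to the output file after every write.
script -c 'echo hello from script' -f /tmp/OUTPUT.txt > /dev/null
```

While a long job runs this way, `tail -f /tmp/OUTPUT.txt` shows its output in real time. Note that script also records "Script started"/"Script done" header and footer lines in the file, so the transcript is not a byte-for-byte copy of the program's output.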