Using jq within pipe chain produces no output
The output from jq is buffered when its standard output is piped. To request that jq flush its output buffer after every object, use its --unbuffered option, e.g.
tail -f in.txt | jq --unbuffered '.f1' | tee out.txt
From the jq manual:

--unbuffered
    Flush the output after each JSON object is printed (useful if you're piping a slow data source into jq and piping jq's output elsewhere).
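A quick way to see the flag's effect (a sketch; it assumes jq is installed, and uses a short loop in place of tail -f so the pipeline terminates):

```shell
# Emit one JSON object per second to simulate a slow source like `tail -f`.
# With --unbuffered, jq flushes each result immediately, so `tee` receives
# (and displays) every value as it arrives, rather than all at once after
# jq's stdio buffer fills up or jq exits.
for i in 1 2 3; do
  printf '{"f1": %d}\n' "$i"
  sleep 1
done | jq --unbuffered '.f1' | tee out.txt
```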
What you're seeing here is C stdio buffering in action: output is stored in a buffer until it reaches a certain size (typically 512 bytes to 4 KB, sometimes larger) and is then written out all at once. This full buffering is only used when stdout is not a terminal; if stdout is connected to a terminal, stdio switches to line buffering instead, but when it's connected to a pipe (as in your case), the block-buffering behavior kicks in.
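You can observe the terminal-vs-pipe distinction from the shell itself: `test -t 1` asks the same question (is file descriptor 1 a terminal?) that stdio's isatty() check answers at startup:

```shell
# With `| cat` attached, fd 1 of the inner shell is a pipe, so this
# prints "pipe". Run the same command without the `| cat` in an
# interactive shell and it prints "terminal" instead.
sh -c 'if [ -t 1 ]; then echo terminal; else echo pipe; fi' | cat
```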
The usual way to disable or control this buffering is the setvbuf() function (see this answer for more details), but that would need to be done in the source code of jq itself, so it's probably not practical for you...
There's a workaround... (a hack, one might say). The "unbuffer" program, distributed with "expect", creates a pseudo-terminal and connects it to a program. So even though jq will still be writing to a pipe, it will think it's writing to a terminal, and the block buffering will be disabled.
Install the "expect" package, which includes "unbuffer", if you don't already have it. For instance, on Debian (or Ubuntu):
$ sudo apt-get install expect
Then you can use this command:
$ tail -f in.txt | unbuffer -p jq '.f1' | tee out.txt
See also this answer for more details on "unbuffer"; a man page is available here too.