Completely buffer command output before piping to another command?
This is similar to a couple of the other answers.
If you have the “moreutils” package, you should have the sponge command. Try
commandA | sponge | { IFS= read -r x; { printf "%s\n" "$x"; cat; } | commandB; }
The sponge command is basically a pass-through filter (like cat), except that it does not start writing output until it has read the entire input. That is, it “soaks up” the data and then releases it when you squeeze it (like a sponge).
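A quick way to observe this buffering (assuming sponge from moreutils is installed): nothing appears for about two seconds, and then both lines print at once.

# Output is held until the input hits EOF, then released in one go.
{ echo first; sleep 2; echo second; } | sponge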
So, to a certain extent, this is “cheating”: if there’s a non-trivial amount of data, sponge almost certainly uses a temporary file. But it’s invisible to you; you don’t have to worry about housekeeping chores like choosing a unique filename and cleaning up afterwards.
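For comparison, here is a rough sketch of the housekeeping that sponge handles for you behind the scenes (commandA and commandB are the same placeholders as above):

tmp=$(mktemp) || exit 1           # choose a unique temporary filename
trap 'rm -f "$tmp"' EXIT          # clean up even if the script exits early
commandA > "$tmp"                 # let commandA run to completion
commandB < "$tmp"                 # only then feed its output to commandB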
The { IFS= read -r x; { printf "%s\n" "$x"; cat; } | commandB; } part reads the first line of output from sponge. Remember, this doesn’t appear until commandA has finished. Then it fires up commandB, writes the first line to the pipe, and invokes cat to read the rest of the output and write it to the pipe.
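Here is a runnable version of the whole construct, with seq standing in for commandA and nl standing in for commandB:

# nl is not started until sponge releases the first line,
# i.e., after seq has already exited.
seq 3 | sponge | { IFS= read -r x; { printf "%s\n" "$x"; cat; } | nl; }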
Commands in a pipeline are started concurrently, so you need to store commandA’s output somewhere to use it later. You can avoid a temporary file by using a variable:
# Append a sentinel character so that the trailing newlines of
# commandA's output survive command substitution, then strip it.
output=$(commandA; echo A)
printf '%s' "${output%%A}" | commandB
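The echo A sentinel is what preserves trailing newlines, which $(...) would otherwise strip. A quick check with printf standing in for commandA:

output=$(printf 'data\n\n'; echo A)
printf '%s' "${output%%A}" | od -c   # od -c shows both trailing newlines intact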