limit find output AND avoid signal 13
Since you're already using GNU extensions (-quit, -H, -m1), you might as well use GNU grep's -r option, together with --line-buffered so it outputs the matches as soon as they are found and is therefore more likely to be killed by a SIGPIPE as soon as it writes the 6th line:
grep -rHm1 --line-buffered pattern /path | head -n 5
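If you want to check that grep really was killed by a SIGPIPE (rather than simply running out of matches), you can look at the pipeline's exit statuses. A minimal sketch, assuming bash (PIPESTATUS is a bash array, not POSIX):
grep -rHm1 --line-buffered pattern /path | head -n 5
echo "${PIPESTATUS[@]}"  # typically "141 0": 141 = 128 + 13 (SIGPIPE) for grep, 0 for head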
With find, you'd probably need to do something like:
find /path -type f -exec sh -c '
  grep -Hm1 --line-buffered pattern "$@"
  # if grep was killed by a SIGPIPE, forward that signal to find (our parent)
  [ "$(kill -l "$?")" = PIPE ] && kill -s PIPE "$PPID"
  ' sh {} + | head -n 5
That is, wrap grep in sh (you still want to run as few grep invocations as possible, hence the {} +), and have sh kill its parent (find) when grep dies of a SIGPIPE.
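The kill -l "$?" test relies on the fact that a command killed by a signal is reported with an exit status of 128 plus the signal number, and that kill -l maps such a status back to the signal name. A small sketch of that mapping, purely for illustration (and assuming SIGPIPE isn't being ignored, which is why the POSIX version below starts with trap - PIPE):
sh -c 'kill -s PIPE "$$"'  # a throw-away shell that kills itself with SIGPIPE (signal 13)
echo "$?"                  # 141 on most shells, i.e. 128 + 13
kill -l 141                # PIPE -- the name the wrapper compares against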
Another approach could be to use xargs as an alternative to -exec {} +. xargs exits straight away when a command it spawns dies of a signal, so in:
find . -type f -print0 |
xargs -r0 grep -Hm1 --line-buffered pattern |
head -n 5
(-r and -0 being GNU extensions.) As soon as grep writes to the broken pipe, both grep and xargs will exit, and find will in turn be killed by a SIGPIPE the next time it writes to the (by then broken) pipe. Running find with unbuffered output, for instance under stdbuf -o0 (stdbuf -oL wouldn't help here, as -print0 separates paths with NULs rather than the newlines line buffering flushes on), might make that happen sooner.
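For example, a sketch assuming GNU coreutils' stdbuf is available:
stdbuf -o0 find . -type f -print0 |
xargs -r0 grep -Hm1 --line-buffered pattern |
head -n 5
With its output unbuffered, find hits the broken pipe on the very next path it prints after xargs is gone, instead of only when its stdio buffer fills up.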
A POSIX version could be:
trap - PIPE # restore default SIGPIPE handler in case it was disabled
RE=pattern find /path -type f -exec sh -c '
  for file do
    awk '\''
      $0 ~ ENVIRON["RE"] {
        print FILENAME ": " $0
        exit
      }'\'' "$file"
    # if awk was killed by a SIGPIPE (head has gone away), tell find to stop
    if [ "$(kill -l "$?")" = PIPE ]; then
      kill -s PIPE "$PPID"
      exit
    fi
  done' sh {} + | head -n 5
This is quite inefficient, though, as it runs several commands for each file.
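Note how the pattern reaches awk through the environment (RE=pattern on the find command line, ENVIRON["RE"] inside awk) rather than being spliced into the awk program text, which avoids quoting and code-injection problems. A minimal standalone sketch of that mechanism (the ^root: pattern and /etc/passwd are purely illustrative):
RE='^root:' awk '$0 ~ ENVIRON["RE"] { print FILENAME ": " $0; exit }' /etc/passwd
# on a typical system prints something like: /etc/passwd: root:x:0:0:root:/root:/bin/sh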
Another solution, which avoids the broken-pipe error messages, could be this:
find / -type f -print0 \
| xargs -0 -L 1 grep -H -m 1 --line-buffered pattern 2>/dev/null \
| head -10
In this example, xargs stops as soon as one of the grep invocations is killed by the SIGPIPE, so there is at most one broken-pipe diagnostic (xargs reporting that grep was terminated by signal 13), and the stderr redirection hides it.
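Note that -L 1 runs one grep per file, which is slow on large trees; the same error suppression can also be combined with the batched xargs invocation shown earlier, as a sketch:
find / -type f -print0 \
| xargs -r0 grep -H -m 1 --line-buffered pattern 2>/dev/null \
| head -10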