grep files from list
You seem to be grepping the list of file names, not the files themselves: <(cat files.txt) just feeds grep the list itself. Try
grep -i 'foo' <(cat $(cat files.txt))
to actually concatenate the listed files and search them as a single stream, or
grep -i 'foo' $(cat files.txt)
to give grep all the files as separate arguments.
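To see the difference between the two forms, here is a minimal sketch; the file names a.txt and b.txt and the temporary directory are assumptions for illustration only:

```shell
# Hypothetical input files; real names would come from your files.txt
dir=$(mktemp -d)
printf 'foo one\n' > "$dir/a.txt"
printf 'foo two\n' > "$dir/b.txt"
printf '%s\n' "$dir/a.txt" "$dir/b.txt" > "$dir/files.txt"

# Single concatenated stream: matches are found, but no file names are shown
cat $(cat "$dir/files.txt") | grep -i 'foo'

# Separate arguments: grep prefixes each match with the file it came from
grep -i 'foo' $(cat "$dir/files.txt")
```

Note that both forms rely on the shell word-splitting the list, so they break on names containing whitespace or glob characters.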
However, if the list contains too many files, you may exceed the maximum argument-list length. In that case I'd just write
while IFS= read -r filename; do grep -Hi 'foo' "$filename"; done < files.txt
(IFS= and -r keep leading blanks and backslashes in the names intact.)
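As a concrete sketch of that loop (the file names and directory here are hypothetical):

```shell
# Hypothetical input files for illustration
dir=$(mktemp -d)
printf 'hello foo\n'     > "$dir/a.txt"
printf 'no match here\n' > "$dir/b.txt"
printf '%s\n' "$dir/a.txt" "$dir/b.txt" > "$dir/files.txt"

# IFS= and -r keep leading blanks and backslashes in names intact;
# -H forces grep to print the file name even when given a single file
while IFS= read -r filename; do
  grep -Hi 'foo' "$filename"
done < "$dir/files.txt"
```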
xargs grep -i -- foo /dev/null < files.txt
assuming the file names are blank- or newline-delimited (quotes or backslashes can be used to escape those separators). With GNU xargs you can specify the delimiter with -d (which then disables the quote handling, though).
(unset -v IFS; set -f; grep -i -- foo $(cat files.txt))
assuming the file names are space-, tab- or newline-separated (there is no way to escape those, though you can choose a different separator by assigning it to IFS). That one will fail on most systems if the file list is too big.
Both also assume that none of the files is called -.
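If the names can contain any character at all, a NUL-delimited list avoids the escaping problem entirely, since NUL cannot appear in a file name. A sketch with GNU xargs -0; the files.nul name and the sample files are assumptions:

```shell
# Hypothetical files, one with a space in its name
dir=$(mktemp -d)
printf 'foo here\n' > "$dir/a b.txt"
printf 'also foo\n' > "$dir/c.txt"

# Build a NUL-delimited list of names
printf '%s\0' "$dir/a b.txt" "$dir/c.txt" > "$dir/files.nul"

# -0 makes xargs split on NUL only; /dev/null forces file-name prefixes
xargs -0 grep -i -- foo /dev/null < "$dir/files.nul"
```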
To read a list of file names from stdin you can use xargs. E.g.,
cat files.txt | xargs -d'\n' grep -i -- 'foo'
By default, xargs reads items from its standard input delimited by blanks. The -d'\n' tells it to use newline as the argument delimiter instead, so it can handle file names containing blanks. (As Stéphane Chazelas points out, -d is a GNU extension.) However, it still won't cope with file names containing newlines; we'd need a slightly more complicated approach to handle those.
FWIW, this approach is somewhat faster than a while read loop, as bash's read builtin is very slow: it reads its input character by character, whereas xargs reads its input more efficiently. Also, xargs invokes grep only as many times as it needs to, passing multiple file names to each invocation, which is more efficient than running grep once per file name.
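The batching behaviour is easy to observe: xargs hands as many arguments as fit to each invocation, so a short list results in a single call. A minimal sketch using sh -c to count the arguments received:

```shell
# Count how many arguments each invocation receives;
# with only four short items, xargs makes a single call
printf '%s\n' one two three four | xargs sh -c 'echo "got $# args"' sh
# prints: got 4 args
```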
See the xargs man page and the xargs info page for further details.