Change multiple files

You can use grep and sed together: grep finds the files that contain the pattern, searching subdirectories recursively, and sed performs the replacement in place.

Linux: grep -r -l <old> * | xargs sed -i 's/<old>/<new>/g'
OS X: grep -r -l <old> * | xargs sed -i '' 's/<old>/<new>/g'

For grep:
    -r recursively searches subdirectories
    -l prints only the names of files that contain matches
For sed:
    -i edits files in place (note: BSD sed on OS X requires a backup-extension argument, which is why the empty '' appears in the OS X command above)
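
As a concrete illustration (foo and bar are placeholder strings here, and file names are assumed to contain no whitespace):

Linux: grep -r -l foo . | xargs sed -i 's/foo/bar/g'
OS X: grep -r -l foo . | xargs sed -i '' 's/foo/bar/g'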

Note that the Linux form won't work with the default sed that ships with Mac OS X, because BSD sed requires an argument to -i.

From man 1 sed:

-i extension
             Edit files in-place, saving backups with the specified
             extension.  If a zero-length extension is given, no backup 
             will be saved.  It is not recommended to give a zero-length
             extension when in-place editing files, as you risk corruption
             or partial content in situations where disk space is exhausted, etc.

I tried

sed -i '.bak' 's/old/new/g' logfile*

and

for i in logfile*; do sed -i '.bak' 's/old/new/g' "$i"; done

Both work fine.
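
Once you've verified the edits, the .bak backups can be removed; assuming GNU or BSD find (both support -delete):

find . -type f -name '*.bak' -delete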


I'm surprised nobody has mentioned the -exec action of find, which is intended for exactly this type of use case, although it will start a separate process for each matching file name:

find . -type f -name 'xa*' -exec sed -i 's/asd/dsg/g' {} \;

Alternatively, one could use xargs, which will invoke fewer processes:

find . -type f -name 'xa*' | xargs sed -i 's/asd/dsg/g'
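
Note, however, that a plain pipe into xargs splits its input on whitespace, so file names containing spaces will break this form. Both GNU and BSD find/xargs support a null-delimited variant that avoids the problem:

find . -type f -name 'xa*' -print0 | xargs -0 sed -i 's/asd/dsg/g'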

Or, more simply, use the + variant of -exec instead of \; so that find passes more than one file name per sed invocation:

find . -type f -name 'xa*' -exec sed -i 's/asd/dsg/g' {} +
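
With +, find batches file names much the way xargs does, but since it hands them to sed directly, names containing whitespace survive intact and no pipe is needed.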

Better yet:

for i in xa*; do
    sed -i 's/asd/dfg/g' "$i"
done

because nobody knows in advance how many files there are, and it's easy to exceed the kernel's command-line length limit. The glob in the for loop is expanded by the shell itself rather than passed to a new process, so it is not subject to that limit; each sed invocation receives only a single file name.
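
If you also need to recurse into subdirectories and cope with file names containing spaces, a null-delimited find loop is a reasonable sketch (this assumes bash, whose read supports -d ''; adjust the -i argument for BSD sed as above):

find . -type f -name 'xa*' -print0 |
while IFS= read -r -d '' f; do
    # sed sees exactly one safely quoted file name per iteration
    sed -i 's/asd/dfg/g' "$f"
done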

Here's what happens when there are too many files:

# grep -c aaa *
-bash: /bin/grep: Argument list too long
# for i in *; do grep -c aaa $i; done
0
... (output skipped)
#
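
The limit being hit here is the kernel's ARG_MAX, which caps the combined size of the arguments passed to a new process; you can inspect it with:

getconf ARG_MAX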
