Execute a command with a Bash variable in it and store the result

If you need to do some kind of transformation on the data first, you can "capture" output with the following syntax:

result="$(iconv -f ISO-8859-1 -t UTF-8 "$1")"

There is a gotcha here as well: if the captured data may contain whitespace or other meddlesome characters, always quote the variable when you use it ("$result" instead of $result) so it is passed along as a single string.
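For instance, when using the result later (converted.txt is just an illustrative output file):

# Quoted: the captured text is written out exactly as captured
printf '%s\n' "$result" > converted.txt

# Unquoted: word splitting and glob expansion would mangle the data
# echo $result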


I'd do as such:

while IFS= read -r filename
do
    mv "$filename" "$filename.bck" && \
        iconv -f ISO-8859-1 -t UTF-8 "$filename.bck" > "$filename"
done < <(find . -iname '*.[hc]')

This creates backups on the fly and, thanks to IFS= read -r, also handles filenames containing whitespace (but not newline characters).
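If you'd rather not rely on bash's process substitution, a pipe works as well; the usual caveat is that the loop body then runs in a subshell, so any variables set inside it won't persist afterwards:

find . -iname '*.[hc]' | while IFS= read -r filename
do
    mv "$filename" "$filename.bck" && \
        iconv -f ISO-8859-1 -t UTF-8 "$filename.bck" > "$filename"
done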


Here is a solution that even handles newlines:

find . -name '*.[ch]' \
    -exec mv '{}' '{}.backup' \; \
    -exec iconv -f ISO-8859-1 -t UTF-8 '{}.backup' -o '{}' \;
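Note that expanding {} inside a larger word like {}.backup is a GNU find feature, and iconv's -o option is a GNU extension as well. A more portable sketch (assuming only POSIX find, sh and iconv) pushes the per-file work into a small sh -c snippet and uses plain redirection for the output:

find . -name '*.[ch]' -exec sh -c '
    mv "$1" "$1.backup" &&
        iconv -f ISO-8859-1 -t UTF-8 "$1.backup" > "$1"
' _ {} \;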

Generally, never parse filenames out of a command's text output if you are going to use the results. The only sane approaches I know of are

  1. Use shell globs, e.g. for file in ./*.[ch] ; do echo "$file" ; done. Only works for one directory.
  2. Use find in combination with -exec, as in the solution above.
  3. Use find in combination with -print0 (which prints the filenames as \0-separated strings) and build command lines from that output with xargs -0, probably via a small helper script (see the sketch after this list). This is quite cumbersome, though.
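A rough sketch of that route (recode-one.sh is a hypothetical helper name; -print0 and -0 assume GNU or BSD find and xargs):

#!/bin/sh
# recode-one.sh: convert each argument in place, keeping a .backup copy
for f in "$@"; do
    mv "$f" "$f.backup" &&
        iconv -f ISO-8859-1 -t UTF-8 "$f.backup" > "$f"
done

and then:

find . -name '*.[ch]' -print0 | xargs -0 sh recode-one.sh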

Also, make sure that any relative filenames you use are prefixed with ./. A filename that starts with a dash looks like an option to the command, so calling mv -from -to isn't safe, but mv ./-from ./-to is, and does what you want. E.g. when globbing, go with ./*.c rather than *.c.
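For instance, with a hypothetical file named -n.c and a destination directory backup/:

mv -n.c backup/      # fails: mv tries to parse -n.c as options
mv ./-n.c backup/    # works: ./-n.c is unambiguously a path
mv -- -n.c backup/   # the -- end-of-options marker works as well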

Tags: Unix, Shell, Bash