How can we run a command stored in a variable?
This has been discussed in a number of questions on unix.SE; I'll try to collect here all the issues I can come up with. References at the end.
Why it fails
The reason you face those problems is word splitting and the fact that quotes expanded from variables don't act as quotes, but are just ordinary characters.
The cases presented in the question:
The assignment here assigns the single string `ls -l "/tmp/test/my dir"` to `abc`:
$ abc='ls -l "/tmp/test/my dir"'
Below, `$abc` is split on whitespace, and `ls` gets the three arguments `-l`, `"/tmp/test/my` and `dir"` (with a quote at the front of the second and another at the back of the third). The option works, but the path gets incorrectly processed:
$ $abc
ls: cannot access '"/tmp/test/my': No such file or directory
ls: cannot access 'dir"': No such file or directory
Here, the expansion is quoted, so it's kept as a single word. The shell tries to find a program literally called `ls -l "/tmp/test/my dir"`, spaces and quotes included.
$ "$abc"
bash: ls -l "/tmp/test/my dir": No such file or directory
And here, `$abc` is split, and only the first resulting word is taken as the argument to `-c`, so Bash just runs `ls` in the current directory. The other words become arguments to bash, and are used to fill `$0`, `$1`, etc.
$ bash -c $abc
'my dir'
With `bash -c "$abc"` and `eval "$abc"`, there's an additional shell processing step, which does make the quotes work, but also causes all shell expansions to be processed again, so there's a risk of accidentally running e.g. a command substitution from user-provided data, unless you're very careful about quoting.
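A minimal sketch of that risk (the command string here is made up): a string that merely contains the text of a command substitution gets it executed during that extra processing step:

```shell
#!/bin/sh
# Sketch: the string below is just data, but the extra processing
# step performed by eval runs the embedded command substitution.
abc='echo "running on $(uname)"'
eval "$abc"    # prints e.g. "running on Linux"
```

If that string had come from user input, the user would have chosen what gets run.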
Better ways to do it
The two better ways to store a command are a) use a function instead, b) use an array variable (or the positional parameters).
Using a function:
Simply declare a function with the command inside, and run the function as if it were a command. Expansions in commands within the function are only processed when the command runs, not when it's defined, and you don't need to quote the individual commands.
# define it
myls() {
ls -l "/tmp/test/my dir"
}
# run it
myls
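A function can also take arguments of its own, received as the function's positional parameters (`lslong` is a made-up name for this sketch):

```shell
#!/bin/sh
# Sketch: the function forwards its own arguments to ls,
# with quoting preserved.
lslong() {
    ls -ld -- "$@"    # "$@" here holds the function's arguments
}
lslong /tmp
```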
Using an array:
Arrays allow creating multi-word variables where the individual words contain whitespace. Here, the individual words are stored as distinct array elements, and the `"${array[@]}"` expansion expands each element as a separate shell word:
# define the array
mycmd=(ls -l "/tmp/test/my dir")
# run the command
"${mycmd[@]}"
The syntax is slightly horrible, but arrays also allow you to build the command line piece-by-piece. For example:
mycmd=(ls) # initial command
if [ "$want_detail" = 1 ]; then
mycmd+=(-l) # optional flag
fi
mycmd+=("$targetdir") # the filename
"${mycmd[@]}"
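To verify where the word boundaries ended up, it helps to print each element of the array on its own line (bash syntax, since plain sh has no arrays):

```shell
#!/bin/bash
# Sketch: printf repeats its format once per argument, so each
# array element lands on its own line, whitespace intact.
mycmd=(ls -l "/tmp/test/my dir")
printf '%s\n' "${mycmd[@]}"
# ls
# -l
# /tmp/test/my dir
```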
or keep parts of the command line constant and use arrays to fill in just a part of it, like options or filenames:
options=(-x -v)
files=(file1 "file name with whitespace")
target=/somedir
transmutate "${options[@]}" "${files[@]}" "$target"
The downside of arrays is that they're not a standard feature, so plain POSIX shells (like `dash`, the default `/bin/sh` in Debian/Ubuntu) don't support them (but see below). Bash, ksh and zsh do, however, so it's likely your system has some shell that supports arrays.
Using "$@"
In shells with no support for named arrays, one can still use the positional parameters (the pseudo-array `"$@"`) to hold the arguments of a command.
The following should be portable script bits that do the equivalent of the code in the previous section. The array is replaced with `"$@"`, the list of positional parameters. Setting `"$@"` is done with `set`, and the double quotes around `"$@"` are important (they cause the elements of the list to be individually quoted).
First, simply storing a command with arguments in `"$@"` and running it:
set -- ls -l "/tmp/test/my dir"
"$@"
Conditionally setting parts of the command line options for a command:
set -- ls
if [ "$want_detail" = 1 ]; then
set -- "$@" -l
fi
set -- "$@" "$targetdir"
"$@"
Only using `"$@"` for options and operands:
set -- -x -v
set -- "$@" file1 "file name with whitespace"
set -- "$@" /somedir
transmutate "$@"
(Of course, `"$@"` is usually filled with the arguments to the script itself, so you'll have to save them somewhere before re-purposing `"$@"`.)
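One way to avoid clobbering the script's own arguments is to do the `set --` inside a function: POSIX makes the positional parameters local to a function call, so the script's `"$@"` is restored when the function returns (the function name here is made up):

```shell
#!/bin/sh
# Sketch: "set --" inside the function only changes the function's
# own positional parameters; the caller's "$@" is intact afterwards.
run_dir_listing() {
    set -- ls -ld -- "$1"
    "$@"
}
set -- one two three        # stand-ins for the script's real arguments
run_dir_listing /tmp
echo "still have $# script arguments"   # still have 3 script arguments
```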
Be careful with eval!
As `eval` introduces an additional level of quote and expansion processing, you need to be careful with user input.
For example, this works as long as the user doesn't type in any single quotes:
read -r filename
cmd="ls -l '$filename'"
eval "$cmd"
But if they give the input `'$(uname)'.txt`, your script happily runs the command substitution.
A version with arrays is immune to that: since the words are kept separate the whole time, no quote or other processing is applied to the contents of `filename`.
read -r filename
cmd=(ls -ld -- "$filename")
"${cmd[@]}"
References
- Word Splitting in BashGuide
- BashFAQ/050 or "I'm trying to put a command in a variable, but the complex cases always fail!"
- The question Why does my shell script choke on whitespace or other special characters?, which discusses a number of issues related to quoting and whitespace, including storing commands.
The safest way to run a (non-trivial) command is `eval`. Then you can write the command as you would on the command line, and it is executed exactly as if you had just entered it. But you have to quote everything.
Simple case:
abc='ls -l "/tmp/test/my dir"'
eval "$abc"
Not so simple case:
# command: awk '! a[$0]++ { print "foo: " $0; }' inputfile
abc='awk '\''! a[$0]++ { print "foo: " $0; }'\'' inputfile'
eval "$abc"
Each embedded single quote has to be written as `'\''`; otherwise the second quote would end the string and break the command.
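In bash (not plain sh), `printf %q` can generate that escaping mechanically instead of hand-writing `'\''` (a sketch; the variable names are made up):

```shell
#!/bin/bash
# Sketch: %q quotes the argument so the resulting string
# survives eval's extra processing step intact.
filename="/tmp/test/my dir"
printf -v cmd 'ls -ld -- %q' "$filename"
echo "$cmd"    # ls -ld -- /tmp/test/my\ dir
```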
When I run:
abc="ls -l '/home/wattana/Desktop'"
$abc
It gave me an error.
But when I run
abc="ls -l /home/wattana/Desktop"
$abc
there is no error at all.
I haven't found a way to fix this yet, but you can avoid the error by not having spaces in the directory name.
This answer says the eval command can be used to fix this, but it doesn't work for me :(