How can I generate arguments to another command via command substitution?
I wrote a script which can generate those arguments for me, with quotes.
If the output is properly quoted for the shell, and you trust the output, then you could run `eval` on it.
Assuming you have a shell that supports arrays, it would be best to use one to store the arguments you get.
If `./gen_args.sh` produces output like `'foo bar' '*' asdf`, then we could run `eval "args=( $(./gen_args.sh) )"` to populate an array called `args` with the results. That would be the three elements `foo bar`, `*`, `asdf`.
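For concreteness, a generator that produces such output might look like this (a hypothetical sketch; the real script can build the quoting however it likes):

```
#!/bin/sh
# gen_args.sh (hypothetical): emit each argument single-quoted,
# so that the whole line is safe to re-parse with eval.
printf "'foo bar' '*' asdf\n"
```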
We can use `"${args[@]}"` as usual to expand the array elements individually:
```
$ eval "args=( $(./gen_args.sh) )"
$ for var in "${args[@]}"; do printf ":%s:\n" "$var"; done
:foo bar:
:*:
:asdf:
```
(Note the quotes. `"${array[@]}"` expands to all elements as distinct arguments, unmodified. Without quotes the array elements are subject to word splitting. See e.g. the Arrays page on BashGuide.)
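To see what the quotes buy you, here's the same loop with the expansion unquoted (globbing is disabled so the `*` doesn't also expand to file names): the elements get split on whitespace.

```
$ set -f
$ for var in ${args[@]}; do printf ":%s:\n" "$var"; done
:foo:
:bar:
:*:
:asdf:
$ set +f
```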
However, `eval` will happily run any shell substitutions, so `$HOME` in the output would expand to your home directory, and a command substitution would actually run a command in the shell running `eval`. An output of `"$(date >&2)"` would create a single empty array element and print the current date on stderr. This is a concern if `gen_args.sh` gets the data from some untrusted source, like another host over the network, or from file names created by other users: the output could include arbitrary commands. (If `gen_args.sh` itself were malicious, it wouldn't need to output anything; it could just run the malicious commands directly.)
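As a sketch of the problem (using a hypothetical `gen_args_evil.sh`):

```
$ cat gen_args_evil.sh
#!/bin/sh
printf '"$(date >&2)"\n'

$ eval "args=( $(./gen_args_evil.sh) )"   # date runs in *this* shell, printing to stderr
$ echo "${#args[@]}"                      # the array got a single, empty element
1
```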
An alternative to shell quoting, which is hard to parse without `eval`, would be to use some other character as a separator in the output of your script. You'd need to pick one that is not needed in the actual arguments. Let's choose `#`, and have the script output `foo bar#*#asdf`. Now we can use unquoted command substitution to split the output of the command into the arguments:
```
$ IFS='#'                       # split on '#' signs
$ set -f                        # disable globbing
$ args=( $( ./gen_args3.sh ) )  # assign the values to the array
$ for var in "${args[@]}"; do printf ":%s:\n" "$var"; done
:foo bar:
:*:
:asdf:
```
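A matching generator for this separator scheme might be as simple as (hypothetical, like `gen_args.sh` above):

```
#!/bin/sh
# gen_args3.sh (hypothetical): join the arguments with '#'
# instead of shell-quoting them.
printf 'foo bar#*#asdf\n'
```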
You'll need to set `IFS` back later if you depend on word splitting elsewhere in the script (`unset IFS` should work to restore the default), and use `set +f` if you want to use globbing later.
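In a larger script you might keep the special settings confined to the part that parses the output, something like this sketch:

```
set -f                      # no globbing while the output is split
IFS='#'
args=( $(./gen_args3.sh) )  # unquoted on purpose: split on '#'
unset IFS                   # back to the default splitting behavior
set +f                      # re-enable globbing
```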
If you're not using Bash or some other shell that has arrays, you could use the positional parameters instead. Replace `args=( $(...) )` with `set -- $(./gen_args.sh)` and use `"$@"` instead of `"${args[@]}"`. (Here, too, you need quotes around `"$@"`, otherwise the positional parameters are subject to word splitting.)
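Put together for a plain POSIX `sh`, using the `#`-separated output from above, that might look like:

```
#!/bin/sh
IFS='#'                   # split the generator's output on '#'
set -f                    # disable globbing
set -- $(./gen_args3.sh)  # load the pieces into $1, $2, ...
unset IFS
set +f
for var in "$@"; do printf ':%s:\n' "$var"; done
```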
The issue is that once your `somecommand` script outputs the options for `othercommand`, the options are really just text, at the mercy of the shell's standard parsing (affected by whatever `$IFS` happens to be and by what shell options are in effect, which in the general case you would not be in control of).
Instead of using `somecommand` to output the options, it would be easier, safer, and more robust to use it to call `othercommand`. The `somecommand` script would then be a wrapper script around `othercommand` instead of some sort of helper script that you would have to remember to call in some special way as part of the command line of `othercommand`. Wrapper scripts are a very common way of providing a tool that just calls some other similar tool with another set of options (just check with `file` which commands in `/usr/bin` are actually shell script wrappers).
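For example, something along these lines lists them (the exact wording of `file`'s output varies between systems):

```
$ file /usr/bin/* | grep -i 'shell script'
```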
In `bash`, `ksh` or `zsh`, you could easily write a wrapper script that uses an array to hold the individual options for `othercommand`, like so:
```
options=( "hi there" "nice weather" "here's a star" "*" )
options+=( "bonus bumblebee!" )  # add additional option
```
Then call `othercommand` (still within the wrapper script):

```
othercommand "${options[@]}"
```
The expansion of `"${options[@]}"` would ensure that each element of the `options` array is individually quoted and presented to `othercommand` as a separate argument.
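You can see the argument boundaries by printing each element between markers:

```
$ options=( "hi there" "nice weather" "here's a star" "*" )
$ printf '<%s>\n' "${options[@]}"
<hi there>
<nice weather>
<here's a star>
<*>
```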
The user of the wrapper would be oblivious to the fact that it's actually calling `othercommand`, something that would not be true if the script instead just generated the command line options for `othercommand` as output.
In `/bin/sh`, use `$@` to hold the options:

```
set -- "hi there" "nice weather" "here's a star" "*"
set -- "$@" "bonus bumblebee!"  # add additional option

othercommand "$@"
```
(`set` is the command used for setting the positional parameters `$1`, `$2`, `$3`, etc. These are what make up the array `$@` in a standard POSIX shell. The initial `--` signals to `set` that there are no options given, only arguments. The `--` is really only needed if the first value happens to be something starting with `-`.)
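To illustrate the point about `--` (a sketch):

```
set -- -f file1    # with '--': $1 is '-f', $2 is 'file1'
echo "$1"          # prints -f

# Without the '--', 'set -f file1' would instead turn off
# filename globbing and set $1 to 'file1'.
```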
Note that it's the double quotes around `$@` and `${options[@]}` that ensure that the elements are not individually word-split (and filename-globbed).
If the `somecommand` output is in reliably good shell syntax, you can use `eval`:
```
$ eval sh test.sh $(echo '"hello " "hi and bye"')
hello
hi and bye
```
But you have to be sure that the output has valid quoting and such, otherwise you might end up running commands outside the script as well:
```
$ cat test.sh
for var in "$@"
do
    echo "|$var|"
done

$ ls
bar baz test.sh

$ eval sh test.sh $(echo '"hello " "hi and bye"; echo rm *')
|hello |
|hi and bye|
rm bar baz test.sh
```
Note that the `echo rm *` at the end wasn't passed to the script (because of the `;`) but was run as a separate command, with the `*` glob expanded by the shell. I added the `|` around `$var` to highlight this.
Generally, unless you can completely trust the output of `somecommand`, it's not possible to reliably use its output to build a command string.