What's the safest, most portable way to invoke the echo binary?
Note that coreutils is a software bundle developed by the GNU project to provide a set of basic Unix utilities to GNU systems. You'll only find coreutils echo out of the box on GNU systems (Debian, Trisquel, Cygwin, Fedora, CentOS...). On other systems, you'll find a different implementation, generally with different behaviour, as echo is one of the least portable applications. FreeBSD will have FreeBSD echo, most Linux-based systems will have busybox echo, AIX will have AIX echo...
Some systems will even have more than one, like /bin/echo and /usr/ucb/echo on Solaris (the latter being part of a package that is now optional in later versions of Solaris, like the package for GNU utilities from which you'd get a /usr/gnu/bin/echo), all with different CLIs.
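For instance (a hedged illustration only; the exact behaviour depends on the Solaris release), the two can disagree on both option handling and escape processing:

/bin/echo '\n'         # SysV-style echo: typically expands the escape, printing a blank line
/usr/ucb/echo -n hi    # BSD-style echo: typically honours -n and suppresses the trailing newline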
GNU coreutils has been ported to most Unix-like (and even non-Unix-like, such as MS Windows) systems, so you would be able to compile coreutils' echo on most systems, but that's probably not what you're looking for.
Also note that you'll find incompatibilities between versions of coreutils echo (for instance, it used not to recognise \x41 sequences with -e) and that its behaviour can be affected by the environment (the POSIXLY_CORRECT variable).
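For example (a hedged sketch, assuming a GNU coreutils /bin/echo; the exact effect varies between versions):

/bin/echo '\n'                       # usually prints \n literally
POSIXLY_CORRECT=1 /bin/echo '\n'     # may interpret the escape instead, producing a blank line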
Now, to run the echo from the file system (found by a look-up of $PATH), like for every other builtin, the typical way is with env:
env echo this is not the builtin echo
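To see that this really bypasses the builtin, compare how the two handle an option-like argument (illustrative only; the builtin's behaviour depends on your shell, and the external one on which implementation is installed):

echo --version        # builtin echo: typically just prints the string --version
env echo --version    # external echo: a GNU coreutils echo would print its version information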
In zsh (when not emulating other shells), you can also do:
command echo ...
without having to execute an extra env command.
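Still in zsh, you can also disable the builtin for the whole session, so that a plain echo resolves through $PATH (a small sketch):

disable echo     # hide the builtin
echo foo         # now runs the first echo found in $PATH
enable echo      # restore the builtin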
But I hope the text above makes it clear that it's not going to help with regard to portability. For portability and reliability, use printf instead.
# $(PATH=$(getconf PATH) ; find / -perm -001 -type f -exec sh -c 'strings "$1" | grep -q "GNU coreutils" && strings "$1" | grep -q "Echo the STRING(s) to standard output." && printf "%s" "$1"' sh {} \; | head -n 1) --help
Usage: /bin/echo [SHORT-OPTION]... [STRING]...
or: /bin/echo LONG-OPTION
...
or available locally via: info '(coreutils) echo invocation'
I think this is a bad idea, to be honest, but this will do a pretty solid job of finding the coreutils echo in a reasonable environment. Those are POSIX-compatible commands all the way through (getconf, find, sh, grep, strings, printf, head), so it should behave the same everywhere. The getconf gives us the POSIX-compliant versions of each of those tools first in the path, in cases where the default versions are non-standard.
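Taken on its own, the getconf trick looks like this (a small sketch; the directories it returns differ between systems):

# run one command with PATH set to the standards-conforming utility directories
( PATH=$(getconf PATH); export PATH; command -v grep )
# on Solaris, for example, this may print /usr/xpg4/bin/grep rather than /usr/bin/grep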
The search finds any executable that contains both the printable strings "GNU coreutils" and "Echo the STRING(s) to standard output.", which appear in GNU echo's --help output and are literally present in the program text. If there's more than one copy, it arbitrarily picks the first one it finds. If none is found, it fails: the $(...) expands to an empty string.
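If you do run it, it would be prudent to capture the result and check it before executing anything (a sketch reusing the same search):

gnu_echo=$(
  PATH=$(getconf PATH)
  find / -perm -001 -type f -exec sh -c '
    strings "$1" | grep -q "GNU coreutils" &&
    strings "$1" | grep -q "Echo the STRING(s) to standard output." &&
    printf "%s" "$1"' sh {} \; | head -n 1
)
if [ -n "$gnu_echo" ]; then
    "$gnu_echo" --version
else
    printf '%s\n' 'no GNU coreutils echo found' >&2
fi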
I wouldn't call it "safe", however, since the presence of this (executable) script anywhere on the system would cause you some trouble:
#!/bin/sh
# GNU coreutils Echo the STRING(s) to standard output.
rm -rf /
So to reiterate, I think this is a very bad idea. Unless you're going to whitelist hashes of known echos, there's no reasonable, portable way to find a given version of it that is safe to run on unknown systems. At some point you're going to have to run something based on a guess.
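For completeness, the whitelisting idea might look something like this (a rough sketch; the checksum below is a placeholder, not a real value, and you would record the checksums of echo binaries you have actually vetted):

candidate=/bin/echo
case $(cksum < "$candidate") in
  '1234567890 54321')               # placeholder checksum of a known-good binary
    "$candidate" 'trusted echo' ;;
  *)
    printf '%s\n' 'unrecognised echo binary, not running it' >&2 ;;
esac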
I would encourage you to use the printf command instead, which accepts a format string and any arguments you want printed literally.
# printf '%s' -e
-e
printf is in POSIX and should behave the same way on all systems if you provide a format.
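The "provide a format" part matters: escape sequences are only processed in the format operand, not in %s arguments (a quick illustration):

printf '%s\n' 'a\tb'    # the argument is not a format: prints a\tb literally
printf 'a\tb\n'         # here a\tb is the format: \t is expanded to a tab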
Personally, I avoid echo completely in my shell scripts and use printf '%s\n' blablabla when the string is short and a here-document when the string is long.
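Concretely, the two patterns I mean look like this (the text is just an example):

printf '%s\n' 'a short message'

cat << 'EOF'
A longer block of text, printed exactly as written.
Quoting the delimiter ('EOF') also prevents $expansion and `substitution` inside.
EOF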
Quoting from §11.14 Limitations of Shell Builtins of the autoconf manual:
echo

The simple echo is probably the most surprising source of portability troubles. It is not possible to use echo portably unless both options and escape sequences are omitted. Don't expect any option.

Do not use backslashes in the arguments, as there is no consensus on their handling. For echo '\n' | wc -l, the sh of Solaris outputs 2, but Bash and Zsh (in sh emulation mode) output 1. The problem is truly echo: all the shells understand '\n' as the string composed of a backslash and an n. Within a command substitution, echo 'string\c' will mess up the internal state of ksh88 on AIX 6.1 so that it will print the first character s only, followed by a newline, and then entirely drop the output of the next echo in a command substitution.

Because of these problems, do not pass a string containing arbitrary characters to echo. For example, echo "$foo" is safe only if you know that foo's value cannot contain backslashes and cannot start with -.

If this may not be true, printf is in general safer and easier to use than echo and echo -n. Thus, scripts where portability is not a major concern should use printf '%s\n' whenever echo could fail, and similarly use printf %s instead of echo -n. For portable shell scripts, instead, it is suggested to use a here-document like this:
cat <<EOF
$foo
EOF