Running a shell command and capturing the output
This is way easier, but it only works on Unix (including Cygwin) and Python 2 (the commands module was removed in Python 3).
import commands
print commands.getstatusoutput('wc -l file')
It returns a tuple of (return_value, output).
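On Python 3, the same convenience is available as subprocess.getstatusoutput, so a near drop-in replacement looks like this (a minimal sketch; the wc -l file command is just the example from above):
import subprocess

# Returns (exit_status, output); the command string is run through the shell.
status, output = subprocess.getstatusoutput('wc -l file')
print(status, output)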
For a solution that works in both Python 2 and Python 3, use the subprocess module instead:
from subprocess import Popen, PIPE

output = Popen(["date"], stdout=PIPE)
response = output.communicate()
print(response)
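Note that communicate() returns a (stdout, stderr) tuple, and under Python 3 the captured stream is a bytes object. A small sketch of unpacking and decoding it (assuming the command emits UTF-8):
from subprocess import Popen, PIPE

# stderr is None here because it was not redirected to a pipe.
stdout, stderr = Popen(["date"], stdout=PIPE).communicate()
print(stdout.decode("utf-8"))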
Something like this:
import subprocess

def runProcess(exe):
    p = subprocess.Popen(exe, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    while True:
        # poll() returns None while the subprocess is still running
        retcode = p.poll()
        line = p.stdout.readline()
        yield line
        if retcode is not None:
            break
Note that I'm redirecting stderr to stdout; that might not be exactly what you want, but I want the error messages too.
This function yields lines one by one as they come in (normally you'd have to wait for the subprocess to finish to get the output as a whole).
For your case, the usage would be:
for line in runProcess('mysqladmin create test -uroot -pmysqladmin12'.split()):
print line,
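On Python 3, a variant of the same idea is to let subprocess do the decoding by passing universal_newlines=True and to iterate the pipe directly. A sketch under those assumptions (named run_process here so it doesn't clash with the function above; ls -l is only an example command):
import subprocess

def run_process(exe):
    # universal_newlines=True makes the pipe yield str lines instead of bytes
    p = subprocess.Popen(exe, stdout=subprocess.PIPE,
                         stderr=subprocess.STDOUT,
                         universal_newlines=True)
    for line in p.stdout:
        yield line
    p.wait()  # reap the process once the pipe hits EOF

for line in run_process(['ls', '-l']):
    print(line, end='')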
In all officially maintained versions of Python, the simplest approach is to use the subprocess.check_output function:
>>> subprocess.check_output(['ls', '-l'])
b'total 0\n-rw-r--r-- 1 memyself staff 0 Mar 14 11:04 files\n'
check_output runs a single program that takes only arguments as input.¹ It returns the result exactly as printed to stdout. If you need to write input to stdin, skip ahead to the run or Popen sections. If you want to execute complex shell commands, see the note on shell=True at the end of this answer.
The check_output function works in all officially maintained versions of Python. But for more recent versions, a more flexible approach is available.
Modern versions of Python (3.5 or higher): run
If you're using Python 3.5+, and do not need backwards compatibility, the new run function is recommended by the official documentation for most tasks. It provides a very general, high-level API for the subprocess module. To capture the output of a program, pass the subprocess.PIPE flag to the stdout keyword argument. Then access the stdout attribute of the returned CompletedProcess object:
>>> import subprocess
>>> result = subprocess.run(['ls', '-l'], stdout=subprocess.PIPE)
>>> result.stdout
b'total 0\n-rw-r--r-- 1 memyself staff 0 Mar 14 11:04 files\n'
The return value is a bytes object, so if you want a proper string, you'll need to decode it. Assuming the called process returns a UTF-8-encoded string:
>>> result.stdout.decode('utf-8')
'total 0\n-rw-r--r-- 1 memyself staff 0 Mar 14 11:04 files\n'
This can all be compressed to a one-liner if desired:
>>> subprocess.run(['ls', '-l'], stdout=subprocess.PIPE).stdout.decode('utf-8')
'total 0\n-rw-r--r-- 1 memyself staff 0 Mar 14 11:04 files\n'
If you want to pass input to the process's stdin, you can pass a bytes object to the input keyword argument:
>>> cmd = ['awk', 'length($0) > 5']
>>> ip = 'foo\nfoofoo\n'.encode('utf-8')
>>> result = subprocess.run(cmd, stdout=subprocess.PIPE, input=ip)
>>> result.stdout.decode('utf-8')
'foofoo\n'
You can capture errors by passing stderr=subprocess.PIPE (capture to result.stderr) or stderr=subprocess.STDOUT (capture to result.stdout along with regular output). If you want run to throw an exception when the process returns a nonzero exit code, you can pass check=True. (Or you can check the returncode attribute of result above.) When security is not a concern, you can also run more complex shell commands by passing shell=True as described at the end of this answer.
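For instance, with check=True a failing command raises subprocess.CalledProcessError, which carries the exit status and any captured output. A small sketch (the failing command is only there to trigger the error):
import subprocess

try:
    subprocess.run(['ls', 'no-such-file'], check=True,
                   stdout=subprocess.PIPE, stderr=subprocess.PIPE)
except subprocess.CalledProcessError as exc:
    print('exit status:', exc.returncode)
    print('captured stderr:', exc.stderr.decode('utf-8'))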
Later versions of Python streamline the above further. In Python 3.7+, the above one-liner can be spelled like this:
>>> subprocess.run(['ls', '-l'], capture_output=True, text=True).stdout
'total 0\n-rw-r--r-- 1 memyself staff 0 Mar 14 11:04 files\n'
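With capture_output=True, run collects both streams, and text=True decodes them to str. A short sketch (the failing command is only there to produce something on stderr):
import subprocess

result = subprocess.run(['ls', 'no-such-file'],
                        capture_output=True, text=True)
print(result.returncode)     # nonzero, since the file doesn't exist
print(repr(result.stderr))   # the error message, already decoded to str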
Using run this way adds just a bit of complexity, compared to the old way of doing things. But now you can do almost anything you need to do with the run function alone.
Older versions of Python (3-3.4): more about check_output
If you are using an older version of Python, or need modest backwards compatibility, you can use the check_output function as briefly described above. It has been available since Python 2.7.
subprocess.check_output(*popenargs, **kwargs)
It takes the same arguments as Popen (see below), and returns a string containing the program's output. The beginning of this answer has a more detailed usage example. In Python 3.5+, check_output is equivalent to executing run with check=True and stdout=PIPE, and returning just the stdout attribute.
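In other words, on Python 3.5+ these two calls capture the same bytes (a small sketch of the equivalence just described; ls -l is only an example command):
import subprocess

out1 = subprocess.check_output(['ls', '-l'])
out2 = subprocess.run(['ls', '-l'], check=True, stdout=subprocess.PIPE).stdout
# Holds as long as the directory contents don't change between the two calls.
assert out1 == out2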
You can pass stderr=subprocess.STDOUT to ensure that error messages are included in the returned output. When security is not a concern, you can also run more complex shell commands by passing shell=True as described at the end of this answer.
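For example, a command that writes to both streams but exits successfully gets its error text folded into the return value. A sketch, using a small shell command purely to generate output on both streams:
import subprocess

out = subprocess.check_output(
    ['sh', '-c', 'echo to-stdout; echo to-stderr >&2'],
    stderr=subprocess.STDOUT)
print(out.decode('utf-8'))  # contains both 'to-stdout' and 'to-stderr'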
If you need to pipe from stderr or pass input to the process, check_output won't be up to the task. See the Popen examples below in that case.
Complex applications and legacy versions of Python (2.6 and below): Popen
If you need deep backwards compatibility, or if you need more sophisticated functionality than check_output or run provide, you'll have to work directly with Popen objects, which encapsulate the low-level API for subprocesses.
The Popen constructor accepts either a single command without arguments, or a list containing a command as its first item, followed by any number of arguments, each as a separate item in the list. shlex.split can help parse strings into appropriately formatted lists. Popen objects also accept a host of different arguments for process IO management and low-level configuration.
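For example, shlex.split handles quoting that a naive str.split would get wrong (the file name here is just an illustration):
import shlex

print(shlex.split("ls -l 'my file.txt'"))
# ['ls', '-l', 'my file.txt']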
To send input and capture output, communicate is almost always the preferred method. As in:
output = subprocess.Popen(["mycmd", "myarg"],
                          stdout=subprocess.PIPE).communicate()[0]
Or
>>> import subprocess
>>> p = subprocess.Popen(['ls', '-a'], stdout=subprocess.PIPE,
... stderr=subprocess.PIPE)
>>> out, err = p.communicate()
>>> print out
.
..
foo
If you set stdin=PIPE, communicate also allows you to pass data to the process via stdin:
>>> cmd = ['awk', 'length($0) > 5']
>>> p = subprocess.Popen(cmd, stdout=subprocess.PIPE,
... stderr=subprocess.PIPE,
... stdin=subprocess.PIPE)
>>> out, err = p.communicate('foo\nfoofoo\n')
>>> print out
foofoo
Note Aaron Hall's answer, which indicates that on some systems, you may need to set stdout, stderr, and stdin all to PIPE (or DEVNULL) to get communicate to work at all.
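A sketch of that defensive pattern, using subprocess.DEVNULL (available since Python 3.3) for the stream you don't otherwise need:
import subprocess

p = subprocess.Popen(['ls', '-a'],
                     stdin=subprocess.DEVNULL,   # nothing will be written to stdin
                     stdout=subprocess.PIPE,
                     stderr=subprocess.PIPE)
out, err = p.communicate()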
In some rare cases, you may need complex, real-time output capturing. Vartec's answer suggests a way forward, but methods other than communicate are prone to deadlocks if not used carefully.
As with all the above functions, when security is not a concern, you can run more complex shell commands by passing shell=True.
Notes
1. Running shell commands: the shell=True argument
Normally, each call to run, check_output, or the Popen constructor executes a single program. That means no fancy bash-style pipes. If you want to run complex shell commands, you can pass shell=True, which all three functions support. For example:
>>> subprocess.check_output('cat books/* | wc', shell=True, text=True)
' 1299377 17005208 101299376\n'
However, doing this raises security concerns. If you're doing anything more than light scripting, you might be better off calling each process separately, and passing the output from each as an input to the next, via
run(cmd, [stdout=etc...], input=other_output)
Or
Popen(cmd, [stdout=etc...]).communicate(other_output)
The temptation to directly connect pipes is strong; resist it. Otherwise, you'll likely see deadlocks or have to resort to hacky workarounds.
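As a concrete sketch of that advice, here is the earlier cat books/* | wc example rewritten as two separate processes, with the first command's output fed to the second via the input argument (the wildcard is expanded with glob instead of the shell, and this assumes books/* matches at least one file):
import glob
import subprocess

# Expand the wildcard ourselves instead of relying on the shell.
files = glob.glob('books/*')
cat = subprocess.run(['cat'] + files, stdout=subprocess.PIPE)
wc = subprocess.run(['wc'], input=cat.stdout, stdout=subprocess.PIPE)
print(wc.stdout.decode('utf-8'))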
I had the same problem but figured out a very simple way of doing this:
import subprocess
output = subprocess.getoutput("ls -l")
print(output)
Hope it helps out
Note: This solution is Python 3 specific, as subprocess.getoutput() doesn't work in Python 2.