CommandLib
Commandlib is a dependency-free library for calling external UNIX commands (e.g. in build scripts) in a clean, readable way.
Using method chaining, you can build up Command objects that run in a specific directory, with specified environment variables and PATHs, etc.
For simplicity's sake, the library itself only runs commands in a blocking way (each command runs to completion before the script continues), although it provides hooks to run them non-blocking via either icommandlib or pexpect.
Pretend 'django/manage.py':
# Pretend django "manage.py" that just prints out arguments:
import sys
sys.stdout.write(' '.join(sys.argv[1:]))
from commandlib import Command
# Create base command
python = Command("python")
# Create command "python manage.py" that runs in the django directory
manage = python("manage.py").in_dir("django")
# Build even more specific command
dev_manage = manage.with_trailing_args("--settings", "local_settings.py")
# Run combined command
dev_manage("runserver", "8080").run()
Will output:
runserver 8080 --settings local_settings.py
Install
$ pip install commandlib
Docs
- Add directory to PATH (with_path)
- Capture output (.output())
- Easily invoke commands from one directory (CommandPath)
- Change your command's environment variables (with_env)
- Run command and don't raise exception on nonzero exit code (ignore_errors())
- Piping data in from string or file (.piped)
- Piping data out to string or file (.piped)
- Run commands interactively using icommandlib or pexpect
- Easily invoke commands from the current virtualenv (python_bin)
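For a flavor of the API, here is a minimal sketch exercising a few of these features. The keyword-argument form of with_env and the attribute access on CommandPath are assumptions based on the feature names above; check the linked docs for exact signatures:

from commandlib import Command, CommandPath

# Capture a command's stdout as a string.
version = Command("git")("--version").output()

# Run with an extra environment variable set (keyword-argument
# form is an assumption; see the with_env docs).
Command("make").with_env(CC="clang").run()

# Invoke commands from one directory via attribute access
# (assumed behavior; see the CommandPath docs).
usr_bin = CommandPath("/usr/bin")
usr_bin.git("status").run()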
Why?
Commandlib avoids the tangle of messy, confusing code that tends to result from using the subprocess library directly (Popen, call, check_output(), .communicate(), etc.).
It's a heavily dogfooded library.
Is subprocess really that bad?
The code will likely be longer and messier. For example, this snippet from Stack Overflow:
import subprocess, os
previous_directory = os.getcwd()
os.chdir("command_directory")
my_env = os.environ.copy()
my_env["PATH"] = "/usr/sbin:/sbin:" + my_env["PATH"]
subprocess.Popen(my_command, env=my_env)
os.chdir(previous_directory)
The equivalent with commandlib:
from commandlib import Command
Command(my_command).with_path("/usr/sbin:/sbin").in_dir("command_directory").run()
Why not use Delegator instead (Kenneth Reitz's 'subprocesses for humans')?
Kenneth Reitz (author of requests, the "urllib2/3 for humans") wrote a similarly inspired "subprocess for humans" library called envoy. Envoy is now deprecated and has been replaced by delegator, a very thin wrapper around subprocess.
Features delegator has which commandlib does not:
- Delegator can chain commands, much like bash does (delegator.chain('fortune | cowsay')). Commandlib doesn't do that because, while dogfooding the library, I never encountered a use case where it was necessary. You can, however, easily get the output of one command as a string using .output() and feed it into another using piped.from_string(string), as sketched after this list.
- Delegator runs subprocesses in both a blocking and a non-blocking way (using pexpect). Commandlib only blocks by itself, but if you pip install pexpect or icommandlib it can run non-blocking via either of them.
- Delegator runs on Windows.
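For example, a rough equivalent of delegator.chain('fortune | cowsay'), sketched using only the .output() and piped.from_string() features described above:

from commandlib import Command

# Capture fortune's stdout as a string...
fortune_text = Command("fortune").output()

# ...then feed it into cowsay's stdin.
Command("cowsay").piped.from_string(fortune_text).run()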
Features which both have:
- Ability to set environment variables.
- Ability to run a pexpect process from a command object (sketched below).
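A hedged sketch of the pexpect hook mentioned above. The .pexpect() method name is an assumption; consult the interactive-commands docs for the exact hook:

from commandlib import Command

# Requires pexpect to be installed (pip install pexpect).
# .pexpect() is an assumed hook name returning a pexpect spawn object.
process = Command("python").pexpect()
process.expect(">>>")                  # wait for the REPL prompt
process.sendline("print('hello')")
process.expect("hello")
process.sendline("exit()")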
Features which only commandlib has:
- Ability to set PATH easily.
- Ability to call commands from within the current virtualenv easily (via the python_bin hook, sketched after this list).
- Ability to pipe in strings or files and easily pipe out to strings or files (or file handles).
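python_bin is the virtualenv hook referred to above. A minimal sketch, assuming it behaves like a CommandPath over the current virtualenv's bin directory:

from commandlib import python_bin

# Assumed to resolve to the virtualenv's own pip and python binaries.
python_bin.pip("install", "requests").run()
python_bin.python("--version").run()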
Why not use other tools?
- os.system - only capable of running simple shell command strings.
- sh - uses a lot of magic. It attempts to make Python more like shell rather than making running commands more pythonic.
- plumbum - similar to amoffat's sh; it tries to create a sort of "bash inside Python". It also has an unusual bracket syntax for building commands (grep["-v", "\.py"]).