Asynchronous background processes in Python?

Posted by Geuis on Stack Overflow

I have been using this as a reference, but I haven't been able to accomplish exactly what I need: http://stackoverflow.com/questions/89228/how-to-call-external-command-in-python/92395#92395

I also was reading this: http://www.python.org/dev/peps/pep-3145/

For our project, we have 5 svn checkouts that need to be updated before we can deploy our application. In my dev environment, where fast deployments matter more for productivity than they do in production, I have been working on speeding up the process.

I have a bash script that has been working decently but has some limitations. I fire up multiple 'svn updates' with the following bash command:

(svn update /repo1) & (svn update /repo2) & (svn update /repo3) &

These all run in parallel and it works pretty well. I also use this pattern in the rest of the build script for firing off each ant build, then moving the wars to Tomcat.

However, I have no control over stopping deployment if one of the updates or a build fails.

I'm rewriting my bash script in Python so I have more control over branches and the deployment process.

I am using subprocess.call() to fire off the 'svn update /repo' commands, but each one runs sequentially. If I try '(svn update /repo) &' they all fire off, but the return code comes back immediately, so I have no way to determine whether a particular command failed when they run asynchronously.

import subprocess

# Each call blocks until the update finishes, so these run one after another.
subprocess.call( 'svn update /repo1', shell=True )
subprocess.call( 'svn update /repo2', shell=True )
subprocess.call( 'svn update /repo3', shell=True )

I'd love to find a way to have Python fire off each Unix command in parallel, and stop the entire script if any of the calls fails at any point.
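
Something like the following is a rough sketch of what I have in mind, using subprocess.Popen to start the updates in parallel and then wait() on each one to check its exit code (I'm not sure this is the right approach, which is why I'm asking):

import subprocess
import sys

repos = ['/repo1', '/repo2', '/repo3']

# Popen returns immediately, so all three updates start at once.
procs = [subprocess.Popen('svn update %s' % repo, shell=True) for repo in repos]

# wait() blocks until each child exits and returns its exit code.
failed = False
for repo, proc in zip(repos, procs):
    if proc.wait() != 0:
        print('svn update failed for %s' % repo)
        failed = True

# Bail out of the deployment if any of the updates failed.
if failed:
    sys.exit(1)

Is this reasonable, or is there a cleaner way to get this behavior?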

