
How To Backport Multiprocessing to 2.4 and 2.5

Just let these guys do it for you.

My hat's off to them for this contribution to the community. It is much appreciated and will find use quickly, I'm sure. I know I have some room for it in my toolbox. Hopefully, the changes will be carried back into the 2.6 line so that any bugfixes that come along will help both stock Python and the backport.

So, if you don't follow 2.6/3.0 development, you might not be aware of multiprocessing, the evolution of the pyprocessing module as it was integrated into the standard library. It was cleaned up and improved as part of its inclusion, so it's really nice to have the result available to the larger Python user base that is still on 2.5 and 2.4. Although some edge cases might still need to be covered, the work is stabilizing quickly.

Here's an overview in case you aren't familiar with it, so you can see whether it would be useful for any of your own purposes. I think, starting out, there is more potential for this backport than for the original multiprocessing module. Thus, I hope a few people find this introduction useful.

>>> from multiprocessing import Process, Pipe
>>>
>>> def f(conn):
...     conn.send([42, None, 'hello'])
...     conn.close()
...
>>> parent_conn, child_conn = Pipe()
>>> p = Process(target=f, args=(child_conn,))
>>> p.start()
>>> print parent_conn.recv()   # prints "[42, None, 'hello']"
[42, None, 'hello']
>>> p.join()

This is an example from the multiprocessing docs, using its Pipe abstraction. The original idea was to emulate the threading model. The provisions are basic, but they give you what you need to coordinate other Python interpreters. Aside from pipes, there are also queues, locks, and worker pools. If you're working on a multicore system with a problem that can be broken up for multiple workers, you can stop complaining about the GIL and dispatch your work out to child processes. It's a great solution, and this makes it a lot easier, giving the anti-thread crowd some welcome validation and an easier time convincing the rest of us. That's a good thing for all of us, because it means software that takes advantage of our new machines and more people who can write that software without the problems threading has always given us. Of course, some pieces, like locks, can still be problematic in the wrong situation, so don't think I'm calling anything a silver bullet. The point is that it's an improvement; nothing is perfect, and I know that.
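To round out the pipes example above, here is a minimal sketch of the worker-pool side of the API. Pool and its map method are part of multiprocessing itself; the square function and the choice of four workers are just placeholders for illustration.

from multiprocessing import Pool

def square(x):
    # Runs in a child process; defined at module level so it can be pickled.
    return x * x

if __name__ == '__main__':
    pool = Pool(processes=4)            # four worker processes
    print pool.map(square, range(10))   # blocks until all results come back
    pool.close()
    pool.join()

Pool.map splits the iterable across the workers and collects the results in order, so for an embarrassingly parallel job it is often the only call you need.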

Comments

Jesse said…
It wasn't that much of a contribution!

In reality, the multiprocessing backport is simply a revision of pyprocessing (original project: http://pyprocessing.berlios.de/), which was included in 2.6. We wanted to make it available with the updated docs/APIs and tests. A big drawback is that the stability of the 2.6 trunk version of multiprocessing relies on changes to python-core which are not in 2.4/2.5.

Thanks for the plug :) There's a lot of work still to be done, and as recent traffic on the python-list shows, there's still some education to do and improvements that could be made as well.

I will be doing a talk on the new package and threaded programming at PyWorks in Atlanta in November, and hopefully a talk at PyCon 2009.
Anonymous said…
What is a good way to communicate with foreign systems that you want to share processing with, in addition to the multicore box you are running your multiprocessing goodness on?

What are some things to avoid? What are good guidelines (if any yet) for integrating the solutions?
