Python: running multiple processes simultaneously

I think what is happening is that you are not doing enough work in some_function to observe the parallelism: each process finishes before the next one is even spawned. If you introduce a random sleep into some_function, you'll see that the processes are in fact running in parallel.

from multiprocessing import Process
import random
import time

def some_function(first, last):
    # Sleep a random amount so the processes finish in an
    # unpredictable order, making the parallelism visible
    time.sleep(random.randint(1, 3))
    print(first, last)

if __name__ == "__main__":
    processes = []

    for m in range(1, 16):
        n = m + 1
        p = Process(target=some_function, args=(m, n))
        p.start()
        processes.append(p)

    for p in processes:
        p.join()

Output

2 3
3 4
5 6
12 13
13 14
14 15
15 16
1 2
4 5
6 7
9 10
8 9
7 8
11 12
10 11

Are you sure? I just tried it and it worked for me; the results are out of order on every execution, so they're being executed concurrently.

Have a look at your function: does its execution time depend on the values of "first" and "last"? If smaller arguments finish faster, the output would come back in ascending order even though the processes are running in parallel, making it look sequential.
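To illustrate that point, here is a minimal sketch (the function name ordered_sleep and the 0.1-second scale factor are my own choices, not from your code) where runtime grows with the argument, so the output arrives in order despite all processes running concurrently:

```python
from multiprocessing import Process
import time

def ordered_sleep(first, last):
    # Runtime is proportional to the argument, so smaller
    # numbers finish first even though all run in parallel
    time.sleep(first * 0.1)
    print(first, last)

if __name__ == "__main__":
    procs = [Process(target=ordered_sleep, args=(m, m + 1)) for m in range(1, 6)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```

The printed pairs come back in ascending order, which is exactly the "looks sequential" symptom.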

ps ux | grep python | grep -v grep | wc -l
> 16

If you execute the code repeatedly (e.g. from a bash script) you can see that every process starts up. If you want to confirm this, import os and have the function print out os.getpid(), so you can see each call runs under a different process ID.
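A quick sketch of that check, using the same structure as the code above with os.getpid() added to the print:

```python
from multiprocessing import Process
import os
import random
import time

def some_function(first, last):
    # Each child runs in its own process, so os.getpid()
    # prints a different PID for every call
    time.sleep(random.randint(1, 3))
    print(os.getpid(), first, last)

if __name__ == "__main__":
    processes = [Process(target=some_function, args=(m, m + 1)) for m in range(1, 16)]
    for p in processes:
        p.start()
    for p in processes:
        p.join()
```

Fifteen distinct PIDs in the output confirms that fifteen separate processes really were spawned.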

So yeah, double check your results because it seems to me like you've written it concurrently just fine!

Tags:

Python