Broken pipe error with multiprocessing.Queue

When you call Queue.put(), a feeder thread is started implicitly to deliver the data into the queue's pipe. Meanwhile, the main application finishes, and there is no receiving end for the data (the queue object is garbage-collected).

I would try this:

from multiprocessing import Queue

def main():
    q = Queue()
    for i in range(10):
        print(i)
        q.put(i)
    q.close()
    q.join_thread()

if __name__ == "__main__":
    main()

join_thread() ensures that all data in the buffer has been flushed. Note that close() must be called before join_thread().
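For completeness, the error also goes away when another process actually consumes the items, since the feeder thread then has a receiving end to deliver to. A minimal sketch (the consume() helper is my own name, not from the question):

```python
import multiprocessing

def consume(q, n):
    # Hypothetical consumer: receives n items from the queue.
    for _ in range(n):
        print(q.get())

def main():
    q = multiprocessing.Queue()
    p = multiprocessing.Process(target=consume, args=(q, 10))
    p.start()
    for i in range(10):
        q.put(i)
    q.close()        # no more data will be put on this queue
    q.join_thread()  # wait for the feeder thread to flush its buffer
    p.join()         # wait for the consumer to finish

if __name__ == "__main__":
    main()
```

close() and join_thread() are still good hygiene here, but the broken pipe cannot occur because the consumer drains the queue before the producer exits.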


EDIT: please use @Peter Svac's answer, which is better. Using join_thread() ensures the Queue finishes its work far more reliably than the time.sleep(0.1) I proposed.

What happens here is that when you call main(), it creates the Queue, puts 10 objects into it, and returns, garbage-collecting all of its local variables and objects, including the Queue. BUT you get this error because the feeder thread is still trying to send the last number through the pipe when the Queue is destroyed.

From the documentation:

"When a process first puts an item on the queue a feeder thread is started which transfers objects from a buffer into the pipe."

Since put() hands the data off to another thread, it does not block the execution of the script, which lets main() return before the Queue operations complete.

Try this:

#!/usr/bin/python
# -*- coding: utf-8 -*-

import multiprocessing
import time
def main():
    q = multiprocessing.Queue()
    for i in range(10):
        print(i)
        q.put(i)
    time.sleep(0.1) # Just enough to let the Queue finish

if __name__ == "__main__":
    main()

There should be a way to join the Queue or block execution until the objects have actually been put in the Queue; you should take a look at the documentation.
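One simple way to sidestep the race entirely, without sleeping, is to drain the queue in the same process before it goes out of scope; a minimal sketch:

```python
import multiprocessing

def main():
    q = multiprocessing.Queue()
    for i in range(10):
        q.put(i)
    # get() blocks until the feeder thread has pushed each item
    # through the pipe, so nothing is left in flight at exit.
    print([q.get() for _ in range(10)])

if __name__ == "__main__":
    main()
```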