Python – Multiprocessing Pool inside Process time out

multiprocessing, pool, process, python, timeout

Whenever I use the following code, the pool result always returns a timeout. Is there something logically incorrect I am doing?

from multiprocessing import Pool, Process, cpu_count

def add(num):
  return num+1

def add_wrap(num):
  new_num = ppool.apply_async(add, [num])
  print new_num.get(timeout=3)

ppool = Pool(processes=cpu_count())

test = Process(target=add_wrap, args=(5,)).start()

I'm aware of this bug, and would have thought it had been fixed in Python 2.6.4?

Best Answer

You can't pass Pool objects between processes.

If you try the following code, Python raises an exception: 'NotImplementedError: pool objects cannot be passed between processes or pickled'.

from multiprocessing import Queue, Pool

q = Queue()
ppool = Pool(processes=2)
# Sending the Pool through a Queue requires pickling it, which fails.
q.put([ppool])
ppool = q.get()

So if you want your code to work, just create your Pool object inside the add_wrap function:

from multiprocessing import Pool, Process, cpu_count

def add(num):
  return num+1

def add_wrap(num):
  # The Pool is created in the child process, so it never has to be pickled.
  ppool = Pool(processes=cpu_count())
  new_num = ppool.apply_async(add, [num])
  print new_num.get(timeout=3)

test = Process(target=add_wrap, args=(5,))
test.start()
test.join()
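For reference, here is a minimal sketch of the same fix on a Python 3 interpreter (an assumption beyond the Python 2.6.4 mentioned in the question): print becomes a function, the entry point gets an if __name__ == '__main__' guard so the module can be re-imported safely under the spawn start method, and the pool is closed and joined before the child exits.

from multiprocessing import Pool, Process, cpu_count

def add(num):
  return num + 1

def add_wrap(num):
  # Create the Pool inside the child process so it is never pickled.
  ppool = Pool(processes=cpu_count())
  new_num = ppool.apply_async(add, [num])
  print(new_num.get(timeout=3))
  ppool.close()
  ppool.join()

if __name__ == '__main__':
  test = Process(target=add_wrap, args=(5,))
  test.start()
  test.join()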