multiprocessing - Python Pool with worker Processes
I am trying to use a worker pool in Python using Process objects. Each worker (a Process) does some initialization (which takes a non-trivial amount of time), gets passed a series of jobs (ideally using map()), and returns something. No communication is necessary beyond that. However, I can't figure out how to use map() to call my worker's compute() function.
from multiprocessing import Pool, Process

class Worker(Process):
    def __init__(self):
        print('Worker started')
        # do some initialization here
        super().__init__()

    def compute(self, data):
        print('Computing things!')
        return data * data

if __name__ == '__main__':
    # This works fine
    worker = Worker()
    print(worker.compute(3))

    # The workers get initialized fine
    pool = Pool(processes=4, initializer=Worker)
    data = range(10)
    # How do I use the worker pool?
    result = pool.map(compute, data)
Is a job Queue the way to go instead, or can I use map()?
I would suggest that you use a Queue for this.
from multiprocessing import Process, Queue

class Worker(Process):
    def __init__(self, queue):
        super().__init__()
        self.queue = queue

    def run(self):
        print('Worker started')
        # do some initialization here

        print('Computing things!')
        for data in iter(self.queue.get, None):
            pass  # use the data here
Now you can start a pile of these, all getting work from a single queue:
request_queue = Queue()
for i in range(4):
    Worker(request_queue).start()
for data in the_real_source:
    request_queue.put(data)
# Sentinel objects to allow clean shutdown: 1 per worker.
for i in range(4):
    request_queue.put(None)
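The question also mentions that each worker returns something, which the sketch above doesn't cover. A minimal way to get results back, assuming a second queue (the ReturningWorker and result_queue names are mine, not from the original answer), is to have run() put its output there:

from multiprocessing import Process, Queue

class ReturningWorker(Process):
    # Hypothetical variant of Worker that sends results back on a second queue.
    def __init__(self, request_queue, result_queue):
        super().__init__()
        self.request_queue = request_queue
        self.result_queue = result_queue

    def run(self):
        # expensive initialization would go here, once per process
        for data in iter(self.request_queue.get, None):
            self.result_queue.put(data * data)

if __name__ == '__main__':
    request_queue, result_queue = Queue(), Queue()
    workers = [ReturningWorker(request_queue, result_queue) for _ in range(4)]
    for w in workers:
        w.start()
    for data in range(10):
        request_queue.put(data)
    # One result per job; note they may arrive out of order.
    results = [result_queue.get() for _ in range(10)]
    for _ in workers:
        request_queue.put(None)  # sentinels for clean shutdown
    for w in workers:
        w.join()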
That kind of thing should let you amortize the expensive startup cost across multiple workers.
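On the map() part of the question: Pool's initializer argument doesn't replace the worker class; it is a callable that Pool runs once in each worker process, which is another way to amortize setup cost. A rough sketch under that assumption, with the expensive state kept in a module-level global (the init_worker and _resource names are mine):

from multiprocessing import Pool

_resource = None

def init_worker():
    # Runs once in each worker process when the Pool starts it.
    global _resource
    _resource = 'expensive setup result'  # placeholder for real initialization

def compute(data):
    # Must be a top-level function so Pool can send a reference to it.
    return data * data

if __name__ == '__main__':
    with Pool(processes=4, initializer=init_worker) as pool:
        result = pool.map(compute, range(10))
    print(result)

This keeps the map() call from the question working, at the cost of the per-process state living in a global rather than on a Worker instance.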
python multiprocessing