Python multiprocessing queue randomly hangs for no reason, despite queue size being tiny
I have the following multiprocessing setup in Python: I subclassed Process, and gave each instance a queue and a few other fields for pickling/data purposes.
This strategy works 95% of the time; the other 5%, the queue hangs for no reason I can find and never finishes (it's common that 3 of the 4 cores finish their jobs and the last one takes forever, so I have to kill the job).
I'm aware that queues in Python have a fixed size and can hang when full, but this queue only stores one-character strings (the id of the processor), so it can't be that.
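(Just to illustrate what I'm ruling out, here is a minimal sketch of the full-queue hang; the maxsize of 1 is made up for illustration:)

    from multiprocessing import Queue
    try:
        from Queue import Full       # Python 2, matching the cPickle usage below
    except ImportError:
        from queue import Full       # Python 3

    q = Queue(maxsize=1)             # bounded queue; maxsize is made up
    q.put('a')                       # fills the queue
    try:
        q.put('b', timeout=1)        # a full queue makes put() block...
    except Full:
        print('put() blocked: queue was full')   # ...but my queue only ever holds tiny ids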
Here is the exact line of code where it halts (it appears to be inside Queue.get()):
res = self._recv()
Does anyone have any ideas? The full code is below. Thank you.
    from multiprocessing import Process, Queue
    from multiprocessing import cpu_count as num_cores
    import codecs, cPickle

    class Processor(Process):

        def __init__(self, queue, elements, process_num):
            super(Processor, self).__init__()
            self.queue = queue          # result channel back to the parent
            self.elements = elements    # this worker's share of the work
            self.id = process_num       # one-character string id

        def job(self):
            ddd = []
            for l in self.elements:
                obj = ... heavy computation ...
                dd = {}
                dd['data'] = obj.data
                dd['meta'] = obj.meta
                ddd.append(dd)
            # dump results to a per-worker file; the queue only carries the id
            cPickle.dump(ddd, codecs.open(urljoin(topdir, self.id + '.txt'), 'w'))
            return self.id

        def run(self):
            self.queue.put(self.job())

    if __name__ == '__main__':
        processes = []
        for i in range(0, num_cores()):
            q = Queue()
            p = Processor(q, divided_work(), process_num=str(i))
            processes.append((p, q))
            p.start()
        for val in processes:
            val[0].join()               # join the worker first...
            key = val[1].get()          # ...then read its id off the queue
            storage = urljoin(topdir, key + '.txt')
            ddd = cPickle.load(codecs.open(storage, 'r'))
            # ... unpack ddd and process data ...
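For context, the multiprocessing docs warn that joining a process before draining the queue it wrote to can deadlock, because a process that has put items on a queue waits for the feeder thread to flush them before it terminates. My payload is a single character, so the buffer shouldn't fill, but here is a sketch of the driver loop reordered to get() before join(), with a timeout (the 60-second bound is made up) so a hang raises instead of blocking forever; Processor, divided_work, topdir and urljoin are as above:

    try:
        from Queue import Empty       # Python 2
    except ImportError:
        from queue import Empty       # Python 3

    if __name__ == '__main__':
        processes = []
        for i in range(0, num_cores()):
            q = Queue()
            p = Processor(q, divided_work(), process_num=str(i))
            processes.append((p, q))
            p.start()
        for p, q in processes:
            try:
                key = q.get(timeout=60)   # drain the queue BEFORE joining; 60s is a made-up bound
            except Empty:
                p.terminate()             # fail loudly instead of hanging forever
                raise
            p.join()
            storage = urljoin(topdir, key + '.txt')
            ddd = cPickle.load(codecs.open(storage, 'r'))
            # ... unpack ddd and process data ...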
Do a time.sleep(0.001) at the beginning of the run() method.
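Applied to the worker class from the question, that looks like this (the 1 ms delay is a workaround for what looks like a startup race, not a documented fix):

    import time

    class Processor(Process):
        # __init__ and job() unchanged from the question

        def run(self):
            time.sleep(0.001)           # brief pause before touching the queue; workaround, not a documented fix
            self.queue.put(self.job())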