Python multiprocessing processes sleep after a while
Become part of the top 3% of the developers by applying to Toptal https://topt.al/25cXVn
--
Music by Eric Matyas
https://www.soundimage.org
Track title: Ominous Technology Looping
--
Chapters
00:00 Question
01:49 Accepted answer (Score 5)
02:25 Thank you
--
Full question
https://stackoverflow.com/questions/3206...
--
Content licensed under CC BY-SA
https://meta.stackexchange.com/help/lice...
--
Tags
#python #multithreading #replace #multiprocessing #cpuusage
#avk47
--
ACCEPTED ANSWER
Score 5
As @Felipe-Lema pointed out, it was a classic RTFM.
I reworked the mentioned part of the script to use a multiprocessing Queue instead of a Pool, which improved the runtime:
import multiprocessing
from multiprocessing import Process, Queue

def check_files(file_list):
    """Checks and replaces lines in files.

    @param file_list: list of files to search
    @return results: list of per-file results
    """
    # as many workers as CPUs are available (HT included)
    workers = multiprocessing.cpu_count()

    # create two queues: one for files, one for results
    work_queue = Queue()
    done_queue = Queue()
    processes = []

    # add every file to the work queue
    for filename in file_list:
        work_queue.put(filename)

    # start the processes
    for w in range(workers):
        p = Process(target=worker, args=(work_queue, done_queue))
        p.start()
        processes.append(p)
        work_queue.put('STOP')  # one sentinel per worker

    # wait until all processes have finished
    for p in processes:
        p.join()
    done_queue.put('STOP')

    # collect the results and return them
    results = []
    for status in iter(done_queue.get, 'STOP'):
        if status is not None:
            results.append(status)
    return results
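The answer never shows the `worker` target it passes to `Process`. A minimal self-contained sketch of the whole pattern might look like this; `process_file` here is a hypothetical stand-in (it just measures the filename length) for the real search-and-replace work, and the queue/sentinel wiring mirrors the answer's code:

```python
import multiprocessing
from multiprocessing import Process, Queue

def process_file(filename):
    # hypothetical stand-in for the real search-and-replace;
    # it just reports the filename length as a fake result
    return (filename, len(filename))

def worker(work_queue, done_queue):
    # pull filenames until the 'STOP' sentinel arrives,
    # pushing one result per file onto the done queue
    for filename in iter(work_queue.get, 'STOP'):
        done_queue.put(process_file(filename))

def run(file_list):
    work_queue = Queue()
    done_queue = Queue()
    processes = []
    for filename in file_list:
        work_queue.put(filename)
    for _ in range(multiprocessing.cpu_count()):
        p = Process(target=worker, args=(work_queue, done_queue))
        p.start()
        processes.append(p)
        work_queue.put('STOP')  # one sentinel per worker
    for p in processes:
        p.join()
    done_queue.put('STOP')
    return [r for r in iter(done_queue.get, 'STOP') if r is not None]

if __name__ == '__main__':
    print(sorted(run(['a.txt', 'bb.txt'])))
```

Note that each worker consumes exactly one `'STOP'` sentinel, which is why one sentinel is enqueued per started process; a single sentinel would stop only the first worker to read it, leaving the rest blocked forever.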