Python thread pool similar to the multiprocessing Pool?
2015-07-13 14:39
Q:
Is there a Pool class for worker threads, similar to the multiprocessing module's Pool class?

I like, for example, the easy way to parallelize a map function:

    def long_running_func(p):
        c_func_no_gil(p)

    p = multiprocessing.Pool(4)
    xs = p.map(long_running_func, range(100))

However, I would like to do it without the overhead of creating new processes.

I know about the GIL. However, in my use case, the function will be an IO-bound C function for which the Python wrapper will release the GIL before the actual function call.

Do I have to write my own threading pool?
A:
I just found out that there actually is a thread-based Pool interface in the multiprocessing module; however, it is somewhat hidden and not properly documented.

It can be imported via

    from multiprocessing.pool import ThreadPool

It is implemented using a dummy Process class wrapping a Python thread. This thread-based Process class can be found in multiprocessing.dummy, which is mentioned briefly in the docs. This dummy module supposedly provides the whole multiprocessing interface based on threads.
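As a minimal sketch of that interface (the square function here is just a stand-in for the IO-bound work from the question):

```python
from multiprocessing.pool import ThreadPool

def square(x):
    # stand-in for an IO-bound C function that releases the GIL
    return x * x

# four worker threads, no new processes are spawned
pool = ThreadPool(4)
results = pool.map(square, range(10))
pool.close()
pool.join()
print(results)  # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```

Because ThreadPool shares the Pool API, the map call above is a drop-in replacement for the multiprocessing.Pool version in the question.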
A:
In Python 3.2+ you can use concurrent.futures.ThreadPoolExecutor, i.e.:

    executor = ThreadPoolExecutor(max_workers=10)
    a = executor.submit(my_function)

See the docs for more info and examples.
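A short self-contained sketch of the two common usage patterns (the fetch function is illustrative, standing in for an IO-bound call):

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def fetch(n):
    # placeholder for an IO-bound call (e.g. a network request)
    return n * 2

with ThreadPoolExecutor(max_workers=4) as executor:
    # map: results come back in input order
    doubled = list(executor.map(fetch, range(5)))

    # submit: one Future per task; as_completed yields them as they finish
    futures = [executor.submit(fetch, n) for n in range(5)]
    finished = sorted(f.result() for f in as_completed(futures))

print(doubled)   # [0, 2, 4, 6, 8]
print(finished)  # [0, 2, 4, 6, 8]
```

The with-block shuts the pool down cleanly once all submitted work has completed.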
A:
    from queue import Queue
    from threading import Thread

    class Worker(Thread):
        """Thread executing tasks from a given tasks queue"""
        def __init__(self, tasks):
            Thread.__init__(self)
            self.tasks = tasks
            self.daemon = True
            self.start()

        def run(self):
            while True:
                func, args, kargs = self.tasks.get()
                try:
                    func(*args, **kargs)
                except Exception as e:
                    print(e)
                self.tasks.task_done()

    class ThreadPool:
        """Pool of threads consuming tasks from a queue"""
        def __init__(self, num_threads):
            self.tasks = Queue(num_threads)
            for _ in range(num_threads):
                Worker(self.tasks)

        def add_task(self, func, *args, **kargs):
            """Add a task to the queue"""
            self.tasks.put((func, args, kargs))

        def wait_completion(self):
            """Wait for completion of all the tasks in the queue"""
            self.tasks.join()

    if __name__ == '__main__':
        from random import randrange
        from time import sleep

        delays = [randrange(1, 10) for i in range(100)]

        def wait_delay(d):
            print('sleeping for (%d)sec' % d)
            sleep(d)

        # 1) Init a Thread pool with the desired number of threads
        pool = ThreadPool(20)

        for i, d in enumerate(delays):
            # print the percentage of tasks placed in the queue
            print('%.2f%%' % (float(i) / len(delays) * 100.0))
            # 2) Add the task to the queue
            pool.add_task(wait_delay, d)

        # 3) Wait for completion
        pool.wait_completion()
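The Worker above discards return values. A small variant (illustrative, not part of the original answer; thread_map is a hypothetical helper name) tags each task with its index so results can be collected in input order even though threads finish out of order:

```python
from queue import Queue
from threading import Thread

def thread_map(func, items, num_threads=4):
    """Apply func to items concurrently, returning results in input order."""
    tasks = Queue()
    results = [None] * len(items)

    def worker():
        while True:
            i, item = tasks.get()
            try:
                results[i] = func(item)
            finally:
                tasks.task_done()

    # daemon workers: they die with the main thread after tasks.join()
    for _ in range(num_threads):
        Thread(target=worker, daemon=True).start()

    for i, item in enumerate(items):
        tasks.put((i, item))
    tasks.join()  # block until every task is marked done
    return results

print(thread_map(lambda x: x + 1, [1, 2, 3]))  # [2, 3, 4]
```

Exceptions inside func are deliberately not swallowed here; the finally clause only ensures task_done() is called so join() cannot deadlock.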