Dead simple example of using Multiprocessing Queue, Pool and Locking
2015-07-13 14:17
Q:
I tried to read the documentation at http://docs.python.org/dev/library/multiprocessing.html, but I'm still struggling with multiprocessing's Queue, Pool, and Lock. For now I was able to build the example below.

Regarding Queue and Pool, I'm not sure I understood the concepts correctly, so correct me if I'm wrong. What I'm trying to achieve is to process 2 requests at a time (the data list has 8 in this example). So what should I use: a Pool to create 2 processes that can handle two different queues (2 at max), or should I just use Queue to process 2 inputs each time? The lock would be to print the outputs correctly.

    import multiprocessing
    import time

    data = (['a', '2'], ['b', '4'], ['c', '6'], ['d', '8'],
            ['e', '1'], ['f', '3'], ['g', '5'], ['h', '7'])

    def mp_handler(var1):
        for indata in var1:
            p = multiprocessing.Process(target=mp_worker, args=(indata[0], indata[1]))
            p.start()

    def mp_worker(inputs, the_time):
        print " Process %s\tWaiting %s seconds" % (inputs, the_time)
        time.sleep(int(the_time))
        print " Process %s\tDONE" % inputs

    if __name__ == '__main__':
        mp_handler(data)

Thanks in advance.
A:
The best solution for your problem is to utilize a Pool. Using Queues and having a separate "queue feeding" functionality is probably overkill.

Here's a slightly rearranged version of your program, this time with only 2 processes corralled in a Pool. I believe it's the easiest way to go, with minimal changes to the original code:
    import multiprocessing
    import time

    data = (
        ['a', '2'], ['b', '4'], ['c', '6'], ['d', '8'],
        ['e', '1'], ['f', '3'], ['g', '5'], ['h', '7'],
    )

    def mp_worker((inputs, the_time)):  # Python 2 tuple-parameter unpacking
        print " Process %s\tWaiting %s seconds" % (inputs, the_time)
        time.sleep(int(the_time))
        print " Process %s\tDONE" % inputs

    def mp_handler():
        p = multiprocessing.Pool(2)
        p.map(mp_worker, data)

    if __name__ == '__main__':
        mp_handler()
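Note that the `def mp_worker((inputs, the_time))` tuple-parameter syntax is Python 2 only (it was removed in Python 3 by PEP 3113). On Python 3, a minimal equivalent sketch keeps two parameters and uses Pool.starmap(), which unpacks each pair for you; the sleep times here are scaled down so the demo runs quickly, where the original sleeps full seconds:

```python
import multiprocessing
import time

data = (
    ['a', '2'], ['b', '4'], ['c', '6'], ['d', '8'],
    ['e', '1'], ['f', '3'], ['g', '5'], ['h', '7'],
)

def mp_worker(inputs, the_time):
    # starmap unpacks each ['letter', 'seconds'] pair into two arguments
    print(" Process %s\tWaiting %s seconds" % (inputs, the_time))
    time.sleep(int(the_time) / 10.0)  # scaled down; the original sleeps full seconds
    print(" Process %s\tDONE" % inputs)
    return inputs

def mp_handler():
    with multiprocessing.Pool(2) as p:     # 2 worker processes, as in the answer
        return p.starmap(mp_worker, data)  # results come back in input order

if __name__ == '__main__':
    mp_handler()
```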
Note that mp_worker() now accepts a single argument (a tuple of the two previous arguments), because Pool.map() passes each element of the data sequence as a single argument to the worker function.
Output:

     Process a    Waiting 2 seconds
     Process b    Waiting 4 seconds
     Process a    DONE
     Process c    Waiting 6 seconds
     Process b    DONE
     Process d    Waiting 8 seconds
     Process c    DONE
     Process e    Waiting 1 seconds
     Process e    DONE
     Process f    Waiting 3 seconds
     Process d    DONE
     Process g    Waiting 5 seconds
     Process f    DONE
     Process h    Waiting 7 seconds
     Process g    DONE
     Process h    DONE
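The question also asked about a lock "to print the outputs correctly." The Pool version above doesn't need one, but if interleaved prints ever become a problem, one common pattern (a sketch, not part of the original answer) is to share a multiprocessing.Lock with the workers via the pool's initializer, since a Lock cannot be passed as a regular map() argument:

```python
import multiprocessing
import time

def init_worker(shared_lock):
    # Runs once in each worker process; stashes the lock in a module global
    global lock
    lock = shared_lock

def mp_worker(args):
    inputs, the_time = args
    with lock:  # only one process prints at a time
        print(" Process %s\tWaiting %s seconds" % (inputs, the_time))
    time.sleep(int(the_time) / 10.0)  # scaled down for a quick demo
    with lock:
        print(" Process %s\tDONE" % inputs)
    return inputs

def mp_handler(data):
    shared_lock = multiprocessing.Lock()
    pool = multiprocessing.Pool(2, initializer=init_worker, initargs=(shared_lock,))
    try:
        return pool.map(mp_worker, data)  # results in input order
    finally:
        pool.close()
        pool.join()

if __name__ == '__main__':
    mp_handler((['a', '2'], ['b', '4'], ['c', '6'], ['d', '8']))
```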
Edit, as per @Thales's comment below:

If you want "a lock for each pool limit" so that your processes run in tandem pairs, à la:

    A waiting, B waiting | A done, B done | C waiting, D waiting | C done, D done | ...

then change the handler function to launch a pool (of 2 processes) for each pair of data:
    def mp_handler():
        # pair up the data: (data[0], data[1]), (data[2], data[3]), ...
        # note: zip drops a trailing element if len(data) is odd
        subdata = zip(data[0::2], data[1::2])
        for task1, task2 in subdata:
            p = multiprocessing.Pool(2)
            p.map(mp_worker, (task1, task2))
            p.close()  # without close/join, each iteration leaks a pool of processes
            p.join()
Now your output is:

     Process a    Waiting 2 seconds
     Process b    Waiting 4 seconds
     Process a    DONE
     Process b    DONE
     Process c    Waiting 6 seconds
     Process d    Waiting 8 seconds
     Process c    DONE
     Process d    DONE
     Process e    Waiting 1 seconds
     Process f    Waiting 3 seconds
     Process e    DONE
     Process f    DONE
     Process g    Waiting 5 seconds
     Process h    Waiting 7 seconds
     Process g    DONE
     Process h    DONE
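For completeness, the "queue feeding" approach the answer calls overkill would look something like the following sketch: two long-lived Process workers pull tasks from a multiprocessing.Queue, with one None sentinel per worker to signal shutdown. It does the same job as Pool(2).map, just with more plumbing (the function and variable names here are illustrative, not from the original post):

```python
import multiprocessing
import time

def mp_worker(task_queue, result_queue):
    # Pull tasks until we see the None shutdown sentinel
    while True:
        task = task_queue.get()
        if task is None:
            break
        inputs, the_time = task
        print(" Process %s\tWaiting %s seconds" % (inputs, the_time))
        time.sleep(int(the_time) / 10.0)  # scaled down for a quick demo
        print(" Process %s\tDONE" % inputs)
        result_queue.put(inputs)

def mp_handler(data, n_workers=2):
    task_queue = multiprocessing.Queue()
    result_queue = multiprocessing.Queue()
    workers = [multiprocessing.Process(target=mp_worker,
                                       args=(task_queue, result_queue))
               for _ in range(n_workers)]
    for w in workers:
        w.start()
    for task in data:
        task_queue.put(task)
    for _ in workers:
        task_queue.put(None)  # one shutdown sentinel per worker
    # drain results before joining, so a full queue can't block worker exit
    results = [result_queue.get() for _ in data]
    for w in workers:
        w.join()
    return results  # completion order, not input order

if __name__ == '__main__':
    mp_handler((['a', '2'], ['b', '4'], ['c', '6'], ['d', '8']))
```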