1. The previous article, "python進程Process與線程threading區別" (the differences between Python processes and threads), explained that threads (threading) share the same memory space, while processes (Process) are independent of one another and do not affect each other (roughly like a deep copy).
2. Threads can communicate through the queue module; processes can also communicate through a Queue, but it is not the thread Queue. The inter-process Queue pickles the data and hands it to the other process, and is used for communication between a parent process and its child processes, or between child processes of the same parent.
# Import the thread-related modules
import threading
import queue

q = queue.Queue()
# Import the process-related modules
from multiprocessing import Process
from multiprocessing import Queue

q = Queue()
# Import the process-related modules
from multiprocessing import Process
from multiprocessing import Pipe

pipe = Pipe()
Python provides several ways for processes to communicate, the main ones being Queue and Pipe: Queue is used for communication among multiple processes, while Pipe is used for communication between exactly two processes.
put(): inserts an item into the queue. It takes two optional parameters, block and timeout (see the official documentation for details).
get(): reads and removes one item from the queue. It likewise takes the optional block and timeout parameters; a short sketch of both follows.
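To make those two parameters concrete, here is a minimal sketch of my own (not from the original post): a bounded multiprocessing.Queue whose put() and get() calls are given a block flag and a timeout, raising queue.Full / queue.Empty when the timeout expires.

# Minimal sketch of put()/get() with block and timeout (illustrative only)
import queue                      # only needed for the Full/Empty exception classes
from multiprocessing import Queue

if __name__ == '__main__':
    q = Queue(maxsize=1)          # a bounded queue so put() can actually block
    q.put('a')                    # succeeds immediately

    try:
        q.put('b', True, 0.1)     # block for at most 0.1s, then give up
    except queue.Full:
        print('queue is full')

    print(q.get())                # -> 'a'
    try:
        q.get(True, 0.1)          # block for at most 0.1s waiting for data
    except queue.Empty:
        print('queue is empty')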
# !usr/bin/env python
# -*- coding:utf-8 _*-
"""
@Author: 何以解憂
@Blog (personal blog): shuopython.com
@WeChat Official Account: 猿說python
@Github: www.github.com

@File: python_process_queue.py
@Time: 2019/12/21 21:25

@Motto: Without single steps there is no journey of a thousand miles; a rewarding programming career is built on persistent accumulation!
"""
from multiprocessing import Process
from multiprocessing import Queue
import os, time, random

# Code executed by the writer processes
def proc_write(q, urls):
    print('Process is write....')
    for url in urls:
        q.put(url)
        print('put %s to queue... ' % url)
        time.sleep(random.random())

# Code executed by the reader process
def proc_read(q):
    print('Process is reading...')
    while True:
        url = q.get(True)
        print('Get %s from queue' % url)

if __name__ == '__main__':
    # The parent process creates the Queue and passes it to each child process
    q = Queue()
    proc_write1 = Process(target=proc_write, args=(q, ['url_1', 'url_2', 'url_3']))
    proc_write2 = Process(target=proc_write, args=(q, ['url_4', 'url_5', 'url_6']))
    proc_reader = Process(target=proc_read, args=(q,))
    # Start the writer child processes
    proc_write1.start()
    proc_write2.start()
    proc_reader.start()
    # Wait for the writers to finish
    proc_write1.join()
    proc_write2.join()
    # proc_reader runs an infinite loop, so terminate it forcibly
    proc_reader.terminate()
    print("main")
Output:
Process is write....
put url_1 to queue...
Process is write....
put url_4 to queue...
Process is reading...
Get url_1 from queue
Get url_4 from queue
put url_5 to queue...
Get url_5 from queue
put url_2 to queue...
Get url_2 from queue
put url_3 to queue...
Get url_3 from queue
put url_6 to queue...
Get url_6 from queue
main
Pipe is typically used between two processes, one sitting at each end of the pipe. Pipe() returns a pair (conn1, conn2) representing the two ends. It takes a duplex parameter, which defaults to True (full-duplex mode: both ends can send and receive); if duplex is False, conn1 can only receive and conn2 can only send. Each connection object provides two methods:

send(): send a message;
recv(): receive a message;
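As a quick illustration of the duplex argument (my own minimal sketch, not part of the original example), a one-way pipe created with duplex=False returns a receive-only end (conn1) and a send-only end (conn2):

# Minimal sketch of a one-way pipe (duplex=False); illustrative only
from multiprocessing import Process, Pipe

def child(sender):
    sender.send('hello from child')          # conn2: the send-only end
    sender.close()

if __name__ == '__main__':
    recv_end, send_end = Pipe(duplex=False)  # (conn1, conn2)
    p = Process(target=child, args=(send_end,))
    p.start()
    print(recv_end.recv())                   # conn1: the receive-only end
    p.join()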
from multiprocessing import Process
from multiprocessing import Pipe
import os, time, random

# Code executed by the sending process
def proc_send(pipe, urls):
    for url in urls:
        print('Process is send :%s' % url)
        pipe.send(url)
        time.sleep(random.random())

# Code executed by the receiving process
def proc_recv(pipe):
    while True:
        print('Process rev:%s' % pipe.recv())
        time.sleep(random.random())

if __name__ == '__main__':
    # The parent process creates the pipe and passes one end to each child process
    pipe = Pipe()
    p1 = Process(target=proc_send, args=(pipe[0], ['url_' + str(i) for i in range(10)]))
    p2 = Process(target=proc_recv, args=(pipe[1],))
    # Start the child processes
    p1.start()
    p2.start()
    p1.join()
    # proc_recv runs an infinite loop, so terminate it forcibly
    p2.terminate()
    print("main")
Output:
Process is send :url_0
Process rev:url_0
Process is send :url_1
Process rev:url_1
Process is send :url_2
Process rev:url_2
Process is send :url_3
Process rev:url_3
Process is send :url_4
Process rev:url_4
Process is send :url_5
Process is send :url_6
Process is send :url_7
Process rev:url_5
Process is send :url_8
Process is send :url_9
Process rev:url_6
main
Of course, we can also test whether the thread Queue (queue.Queue from the threading world) can be used for communication between processes; sample code:
from multiprocessing import Process
# from multiprocessing import Queue  # inter-process Queue; do not confuse the two
import queue  # inter-thread queue.Queue; do not confuse the two
import time

def p_put(q, *args):
    q.put(args)
    print('Has put %s' % args)

def p_get(q, *args):
    print('%s wait to get...' % args)
    print(q.get())
    print('%s got it' % args)

if __name__ == "__main__":
    q = queue.Queue()
    p1 = Process(target=p_put, args=(q, 'p1',))
    p2 = Process(target=p_get, args=(q, 'p2',))
    p1.start()
    p2.start()
It fails immediately with an exception:
Traceback (most recent call last):
  File "E:/Project/python_project/untitled10/123.py", line 38, in <module>
    p1.start()
  File "G:\ProgramData\Anaconda3\lib\multiprocessing\process.py", line 105, in start
    self._popen = self._Popen(self)
  File "G:\ProgramData\Anaconda3\lib\multiprocessing\context.py", line 223, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
  File "G:\ProgramData\Anaconda3\lib\multiprocessing\context.py", line 322, in _Popen
    return Popen(process_obj)
  File "G:\ProgramData\Anaconda3\lib\multiprocessing\popen_spawn_win32.py", line 65, in __init__
    reduction.dump(process_obj, to_child)
  File "G:\ProgramData\Anaconda3\lib\multiprocessing\reduction.py", line 60, in dump
    ForkingPickler(file, protocol).dump(obj)
TypeError: can't pickle _thread.lock objects
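The error occurs because multiprocessing pickles the child process's arguments when spawning it, and a queue.Queue contains thread locks that cannot be pickled. Switching to multiprocessing.Queue, which is designed to cross process boundaries, makes essentially the same code work; a minimal sketch of that corrected version:

from multiprocessing import Process, Queue   # process-safe Queue instead of queue.Queue

def p_put(q, *args):
    q.put(args)
    print('Has put %s' % args)

def p_get(q, *args):
    print('%s wait to get...' % args)
    print(q.get())
    print('%s got it' % args)

if __name__ == "__main__":
    q = Queue()                               # multiprocessing.Queue can be passed to child processes
    p1 = Process(target=p_put, args=(q, 'p1',))
    p2 = Process(target=p_get, args=(q, 'p2',))
    p1.start()
    p2.start()
    p1.join()
    p2.join()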
Related reading: python進程Process與線程threading區別 (the differences between Python processes and threads)
When reposting, please credit: 猿說Python » python 進程間通訊 Queue