When processes communicate through a Queue, a problem appears if put lags behind while get keeps running: a consumer that loops with while queue.qsize() != 0: may check the queue before the producer has had time to put the next item in. At that moment queue.qsize() == 0, so the loop exits, and anything put in afterwards can never be retrieved. Keep this in mind when parsing web pages: it is better to parse list pages and feed their URLs into the queue, because a list page yields many URLs at once and is certainly faster than the method that parses detail pages. The example below reproduces the problem.
import time
from multiprocessing import Process, Queue, Manager
from concurrent.futures import ThreadPoolExecutor, ProcessPoolExecutor
import os
# from queue import Queue

def put(queue):
    for i in range(50):
        queue.put(i)
        time.sleep(1)  # During this sleep, before the next item goes in, the queue is already empty (the other processes have drained it), so the consumers exit and anything put in afterwards is never retrieved.
        print(queue.qsize())

def get(queue):
    while queue.qsize() != 0:
        s = queue.get()
        print('get %s :%s' % (s, os.getpid()))
    print('i finished')

if __name__ == '__main__':   # the guard is required for ProcessPoolExecutor on platforms that spawn processes
    queue = Manager().Queue(maxsize=20)
    executor = ProcessPoolExecutor(max_workers=5)
    task1 = executor.submit(put, queue)
    task2 = executor.submit(get, queue)
    task3 = executor.submit(get, queue)
    task4 = executor.submit(get, queue)
    task5 = executor.submit(get, queue)

Result:

get 0 :5392
i finished
i finished
i finished
i finished
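The usual way around this race is not to infer completion from qsize() at all, but to let the producer announce it. The sketch below is one possible fix, not part of the original code: it assumes a sentinel value (None here) that the producer puts once per consumer, and each consumer blocks on get() until it sees that marker. The names NUM_CONSUMERS and SENTINEL are illustrative.

import os
import time
from multiprocessing import Manager
from concurrent.futures import ProcessPoolExecutor

NUM_CONSUMERS = 4   # hypothetical constant for this sketch
SENTINEL = None     # hypothetical end-of-stream marker

def put(queue):
    for i in range(50):
        queue.put(i)
        time.sleep(1)
    # Tell each consumer explicitly that production is finished.
    for _ in range(NUM_CONSUMERS):
        queue.put(SENTINEL)

def get(queue):
    while True:
        s = queue.get()      # blocks until an item (or a sentinel) arrives
        if s is SENTINEL:
            break            # the producer says nothing more is coming
        print('get %s :%s' % (s, os.getpid()))
    print('i finished')

if __name__ == '__main__':
    queue = Manager().Queue(maxsize=20)
    with ProcessPoolExecutor(max_workers=NUM_CONSUMERS + 1) as executor:
        executor.submit(put, queue)
        for _ in range(NUM_CONSUMERS):
            executor.submit(get, queue)

An alternative with a similar effect is to call queue.get(timeout=...) and treat the resulting queue.Empty exception as the signal that no more work will arrive; which variant fits better depends on whether the producer can know when it is done.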