1. Celery Introduction
Celery is a powerful distributed task queue that lets task execution be completely decoupled from the main program; tasks can even be dispatched to run on other hosts. We usually use it to implement asynchronous tasks (async task) and scheduled tasks (crontab). Asynchronous tasks are relatively time-consuming operations such as sending email, uploading files, or image processing; scheduled tasks are tasks that need to run at specific times. Its architecture is made up of the following components:
[The above is adapted from] Author: Shyllin, Source: CSDN, Original: https://blog.csdn.net/Shyllin/article/details/80940643?utm_source=copy Copyright notice: the original is the author's own work; please include a link to it when reposting.
Key points (a minimal sketch tying them together follows this list):
(1). Task module: contains the asynchronous tasks and the scheduled tasks. Asynchronous tasks are triggered in business logic and sent to the task queue; scheduled tasks are sent to the task queue periodically by celery beat.
(2). celery beat: the task scheduler. The beat process reads the configuration (set in celerybeat_schedule) and periodically sends the scheduled tasks that are due to the task queue.
(3). Message broker: the task dispatch queue. It receives task messages from producers and stores them in a queue. Celery itself does not provide a queue service; the official recommendations are RabbitMQ and Redis, and here we use Redis.
(4). Worker: the unit that actually processes tasks. It monitors the message queue in real time, fetches tasks from the queue, and executes them.
(5). Result backend: stores the results of task execution. Like the broker, it can use RabbitMQ or Redis.
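To see how these five pieces map onto Celery's API, here is a minimal single-module sketch. The module name demo_app and the Redis database numbers are illustrative assumptions only; the full multi-file layout used in this post follows in section 2.

# demo_app.py -- minimal sketch of the five components in one module
from datetime import timedelta

from celery import Celery

app = Celery(
    'demo_app',
    broker='redis://localhost:6379/6',   # (3) message broker
    backend='redis://localhost:6379/5',  # (5) result backend
)

@app.task
def add(x, y):
    # (1) a task; a worker process -- (4) -- picks it off the queue and runs it
    return x + y

# (2) celery beat: put add(1, 12) on the queue every 10 seconds
app.conf.beat_schedule = {
    'add-every-10-seconds': {
        'task': 'demo_app.add',
        'schedule': timedelta(seconds=10),
        'args': (1, 12),
    }
}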
2. Code Example:
(1). Directory structure
celery_learning
|----celery_config.py
|----celery_app.py
|----tasks.py
|----test.py
(2). Code:
celery_config.py
# -*- coding: utf-8 -*-
from __future__ import absolute_import
from datetime import timedelta
from celery.schedules import crontab  # available for cron-style schedules

# Message broker
BROKER_URL = 'redis://localhost:6379/6'
# Result backend
CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379/5'
# Default worker queue
CELERY_DEFAULT_QUEUE = 'default'
# Modules to import so the worker can find the asynchronous tasks
CELERY_IMPORTS = ('tasks',)

# celery beat schedule: send tasks.add(1, 12) to the queue every 10 seconds
CELERYBEAT_SCHEDULE = {
    'add': {
        'task': 'tasks.add',
        'schedule': timedelta(seconds=10),
        'args': (1, 12)
    }
}
[Note] Watch the backend URL: written with a stray colon, e.g. 'redis://:127.0.0.1:6379/5', the URL parser mis-reads the host portion and startup fails with:
ValueError: invalid literal for int() with base 10: '127.0.0.1'
The password form of a Redis URL is 'redis://:password@host:port/db'; without a password, omit the leading colon as above.
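As an aside, the uppercase names above are the older setting style; Celery 4 also understands the newer lowercase names. A sketch of the same config written that way (same values, only the setting names change; either style works, but the two styles should not be mixed in one module):

# celery_config.py -- lowercase-settings variant (Celery 4 style)
from __future__ import absolute_import
from datetime import timedelta

broker_url = 'redis://localhost:6379/6'        # message broker
result_backend = 'redis://127.0.0.1:6379/5'    # result backend
task_default_queue = 'default'                 # default worker queue
imports = ('tasks',)                           # modules containing tasks

beat_schedule = {
    'add': {
        'task': 'tasks.add',
        'schedule': timedelta(seconds=10),
        'args': (1, 12),
    }
}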
celery_app.py
from __future__ import absolute_import

from celery import Celery

app = Celery('celery_app')
app.config_from_object('celery_config')
tasks.py
from celery_app import app


@app.task(queue='default')
def add(x, y):
    return x + y


@app.task(queue='default')
def sub(x, y):
    return x - y
test.py
import sys, os

# sys.path.append(os.path.abspath('.'))
sys.path.append(os.path.abspath('..'))

from tasks import add


def add_loop():
    ret = add.apply_async((1, 2), queue='default')
    print(type(ret))
    return ret


if __name__ == '__main__':
    ret = add_loop()
    print(ret.get())
    print(ret.status)
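A note on the calling API used in test.py: apply_async() is the general form that accepts execution options such as queue=; delay() is a shorthand that only takes the task's own arguments, and get() can be given a timeout so it does not block forever when no worker is running. A small variation of test.py as a sketch:

from tasks import add

# delay() is shorthand for apply_async() and only accepts the task's own
# arguments; per-call options such as queue= must go through apply_async().
ret = add.delay(1, 2)

# get() blocks until the worker stores the result in the backend;
# a timeout (in seconds) raises an error instead of waiting indefinitely.
print(ret.get(timeout=10))
print(ret.status)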
3. Execution Steps:
(1). Asynchronous task:
In a terminal, run celery -A celery_app worker -Q default --loglevel=info
Run test.py.
[Output]:
python test.py
<class 'celery.result.AsyncResult'>
3
SUCCESS
Worker log:
[2018-10-16 15:17:15,173: INFO/MainProcess] Received task: tasks.add[79bfdfc8-d6eb-44b4-b094-4355961d18b3]
[2018-10-16 15:17:15,193: INFO/ForkPoolWorker-2] Task tasks.add[79bfdfc8-d6eb-44b4-b094-4355961d18b3] succeeded in 0.0133806611411s: 3
(2). Scheduled task:
In terminal 1, run celery -A celery_app worker -Q default --loglevel=info
In terminal 2, run celery -A celery_app beat
celery beat v4.2.0 (windowlicker) is starting.
LocalTime -> 2018-10-16 15:30:57
Configuration ->
    . broker -> redis://localhost:6379/6
    . loader -> celery.loaders.app.AppLoader
    . scheduler -> celery.beat.PersistentScheduler
    . db -> celerybeat-schedule
    . logfile -> [stderr]@%WARNING
    . maxinterval -> 5.00 minutes (300s)
[Output]:
[2018-10-16 15:31:27,078: INFO/MainProcess] Received task: tasks.add[80e5b7a9-f610-47b2-91ae-2de731ee58f2]
[2018-10-16 15:31:27,089: INFO/ForkPoolWorker-2] Task tasks.add[80e5b7a9-f610-47b2-91ae-2de731ee58f2] succeeded in 0.00957242585719s: 13
[2018-10-16 15:31:37,078: INFO/MainProcess] Received task: tasks.add[3b03d215-139b-4072-856e-b4941c332215]
[2018-10-16 15:31:37,080: INFO/ForkPoolWorker-3] Task tasks.add[3b03d215-139b-4072-856e-b4941c332215] succeeded in 0.000707183033228s: 13
[2018-10-16 15:31:47,077: INFO/MainProcess] Received task: tasks.add[2ecb1645-5dff-4647-8035-fd1fdf8a4249]
[2018-10-16 15:31:47,079: INFO/ForkPoolWorker-2] Task tasks.add[2ecb1645-5dff-4647-8035-fd1fdf8a4249] succeeded in 0.000692856963724s: 13
[2018-10-16 15:31:57,079: INFO/MainProcess] Received task: tasks.add[3abac1ce-df1d-4c2f-b08f-3248c054f893]
[2018-10-16 15:31:57,082: INFO/ForkPoolWorker-3] Task tasks.add[3abac1ce-df1d-4c2f-b08f-3248c054f893] succeeded in 0.00103094079532s: 13
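For local development it can be convenient to skip the second terminal and embed beat inside the worker: the celery worker command accepts a -B (--beat) flag for this, e.g. celery -A celery_app worker -B -Q default --loglevel=info. The Celery docs advise against -B in production, where beat should keep running as its own process as shown above.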
4. celery flower
(1). View task history, task arguments, start time, and other details.
(2). Provides graphs and statistics.
(3). Full remote control, including but not limited to revoking/terminating tasks, shutting down or restarting workers, and viewing currently running tasks.
(4). Provides an HTTP API for easy integration (see the Python sketch after the startup log below).
In a terminal, run: celery flower --broker=redis://localhost:6379/6
[I 181016 15:41:13 command:139] Visit me at http://localhost:5555
[I 181016 15:41:13 command:144] Broker: redis://localhost:6379/6
[I 181016 15:41:13 command:147] Registered tasks:
    [u'celery.accumulate',
     u'celery.backend_cleanup',
     u'celery.chain',
     u'celery.chord',
     u'celery.chord_unlock',
     u'celery.chunks',
     u'celery.group',
     u'celery.map',
     u'celery.starmap']
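Once flower is up, the HTTP API mentioned in (4) can be called directly. Here is a minimal sketch using only the standard library; port 5555 matches the log above, and /api/workers is flower's endpoint for listing workers, but verify the exact paths against the flower version you run:

# flower_api_demo.py -- query flower's HTTP API (endpoint path assumed, check your flower version)
import json

try:
    from urllib.request import urlopen  # Python 3
except ImportError:
    from urllib2 import urlopen         # Python 2

FLOWER = 'http://localhost:5555'

# List the workers flower currently knows about.
workers = json.loads(urlopen(FLOWER + '/api/workers').read().decode('utf-8'))
for name in workers:
    print(name)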
Reference blog:
https://www.cnblogs.com/halleluyah/p/9798418.html