celery beat is a scheduler: it kicks off tasks at regular intervals, and the tasks are then executed by the worker nodes available in the cluster.
By default the entries are taken from the beat_schedule setting, but custom stores can also be used, for example keeping the entries in an SQL database.
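As a rough sketch of what the beat_schedule setting looks like (illustrative only; it assumes a task registered as tasks.add, such as the one defined in tasks.py below, and a placeholder broker URL), entries can be declared statically like this:

from celery import Celery
from celery.schedules import crontab

app = Celery('tasks', broker='pyamqp://')   # broker URL here is just a placeholder

app.conf.beat_schedule = {
    'add-every-30-seconds': {
        'task': 'tasks.add',    # name under which the task is registered
        'schedule': 30.0,       # run every 30 seconds
        'args': (2, 2),
    },
    'add-monday-morning': {
        'task': 'tasks.add',
        'schedule': crontab(hour=7, minute=30, day_of_week=1),
        'args': (2, 2),
    },
}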
You must ensure that only a single scheduler is running for a given schedule at a time, otherwise you will end up with duplicate tasks. Using a centralized approach means the schedule does not have to be synchronized, and the service can operate without using locks.
To call a task periodically you have to add an entry to the beat schedule list.
tasks.py
from celery import Celery
from celery.schedules import crontab

app = Celery('tasks',
             broker='pyamqp://celery:celery@192.168.0.12:5672/celery_vhost',
             backend='redis://localhost:6379/0')
# app = Celery('tasks', backend='redis://localhost', broker='pyamqp://')

app.conf.update(
    task_serializer='json',
    accept_content=['json'],  # Ignore other content
    result_serializer='json',
    timezone='Asia/Shanghai',
    enable_utc=True,
)

@app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    # Calls test('hello') every 10 seconds.
    sender.add_periodic_task(10.0, test.s('hello'), name='add every 10')

    # Calls add(2, 2) every 30 seconds
    sender.add_periodic_task(30.0, add.s(2, 2), expires=10)

    # Executes every Monday morning at 7:30 a.m.
    sender.add_periodic_task(
        crontab(hour=7, minute=30, day_of_week=1),
        test.s('Happy Mondays!'),
    )

@app.task
def test(arg):
    print(arg)

@app.task
def add(x, y):
    return x + y
Beat needs to store the last run times of the tasks in a local database file (named celerybeat-schedule by default), so it needs write access in the current directory; alternatively, you can specify a custom location for this file:
celery -A tasks beat -s /var/run/celery/celerybeat-schedule
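If you do want the entries kept in an SQL database, as mentioned at the start, one widely used option is the django-celery-beat extension; assuming a Django project with that package installed and its migrations applied (which is outside the scope of this example), beat is pointed at its database-backed scheduler on the command line:

celery -A tasks beat -l info --scheduler django_celery_beat.schedulers:DatabaseScheduler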
Then start a worker in another terminal:
celery -A tasks worker -l info
You should see log output like this:
[2019-10-24 14:45:53,448: INFO/ForkPoolWorker-4] Task tasks.add[e028900c-f2a3-468e-8cb8-4ae72d0e77fe] succeeded in 0.0020012762397527695s: 4
[2019-10-24 14:46:03,370: INFO/MainProcess] Received task: tasks.test[0635b276-19c9-4d76-9941-dbe9e7320a7f]
[2019-10-24 14:46:03,372: WARNING/ForkPoolWorker-6] hello
[2019-10-24 14:46:03,374: INFO/ForkPoolWorker-6] Task tasks.test[0635b276-19c9-4d76-9941-dbe9e7320a7f] succeeded in 0.0021341098472476006s: None
[2019-10-24 14:46:13,371: INFO/MainProcess] Received task: tasks.test[afcfa84c-3a3b-48bf-9191-59ea55b08eea]
[2019-10-24 14:46:13,373: WARNING/ForkPoolWorker-8] hello
[2019-10-24 14:46:13,375: INFO/ForkPoolWorker-8] Task tasks.test[afcfa84c-3a3b-48bf-9191-59ea55b08eea] succeeded in 0.002273786813020706s: None
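Because a redis result backend is configured, the return value of a finished run can also be fetched by task id. A quick sketch, using the id of the tasks.add run shown in the log above and assuming the result has not yet expired from the backend (one day by default):

from tasks import app

# Look up a finished task by the id printed in the worker log
res = app.AsyncResult('e028900c-f2a3-468e-8cb8-4ae72d0e77fe')
print(res.status)   # 'SUCCESS'
print(res.result)   # 4, the return value of add(2, 2)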
You can also embed beat inside the worker by enabling the worker's -B option. This is convenient if you will never run more than one worker node, but it is not commonly used and for that reason is not recommended for production use:
celery -A tasks worker -B -l info