Using Celery in Django

(1) Overview

  • Celery is a simple, flexible, and reliable distributed system for processing large volumes of tasks, and it ships with the tools needed to operate and maintain such a system. It is a task queue focused on real-time processing that also supports task scheduling. The unit of execution is the task (task), and tasks can run concurrently on a single worker or across multiple workers.
  • Celery communicates through messages, usually with a broker mediating between clients and workers: a client sends a message, the broker routes it to a worker, and the worker executes the task.
  • Celery can run with multiple workers and multiple brokers, which improves availability and horizontal scalability.
  • Celery is written in Python, but its protocol can be implemented in any language, for example node-celery for Node.js and celery-php for PHP.
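The client → broker → worker flow described above can be sketched in plain Python (this is a toy model, not Celery itself): a `queue.Queue` stands in for the broker, a thread for the worker, and a dict for the result backend. All names here are illustrative.

```python
import queue
import threading

broker = queue.Queue()   # stands in for Redis/RabbitMQ
results = {}             # stands in for the result backend

def worker():
    # The worker consumes messages from the broker and executes tasks.
    while True:
        task_id, func, args = broker.get()
        if func is None:            # shutdown message
            break
        results[task_id] = func(*args)

t = threading.Thread(target=worker)
t.start()

# Client side: instead of calling the function, send a message.
broker.put(("task-1", lambda x, y: x + y, (2, 3)))
broker.put((None, None, None))      # tell the worker to stop
t.join()

print(results["task-1"])  # 5
```

The client never runs the task itself; it only publishes a message, which is exactly why adding more workers scales horizontally.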

(2) Workflow and Configuration for Using Celery in Django

  • Install Celery: pip3 install celery
  • Create a celery.py file in the directory that has the same name as the project (note: it must be the project-named directory, next to settings.py)

    • Copy the content below into that file
    • Change two things:

      • In os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings'), change proj to your project name
      • In app = Celery('pro'), change pro to your project name
import os

from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('pro')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()


@app.task(bind=True)
def debug_task(self):
    print(f'Request: {self.request!r}')
  • Add the following to the __init__.py file in the project-named directory
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app

__all__ = ('celery_app',)
  • Add the configuration to settings.py

    • CELERY_BROKER_URL: the broker URL; Redis or RabbitMQ can be used
    • CELERY_RESULT_BACKEND: where task results are stored
    • CELERY_ACCEPT_CONTENT: accepted content formats, typically json and msgpack; msgpack is more compact than json and faster to transmit
    • CELERY_TASK_SERIALIZER: serialization format for task payloads, e.g. json
    • CELERY_TIMEZONE: the timezone Celery uses
    • CELERY_TASK_TRACK_STARTED: whether tasks report a "started" state
    • CELERY_TASK_TIME_LIMIT: hard time limit for tasks, in seconds
# Celery settings
CELERY_BROKER_URL = env("CELERY_BROKER_URL")
CELERY_RESULT_BACKEND = env("CELERY_RESULT_BACKEND")
CELERY_ACCEPT_CONTENT = ["json", "msgpack"]
CELERY_TASK_SERIALIZER = "json"
CELERY_TIMEZONE = "Asia/Shanghai"
CELERY_TASK_TRACK_STARTED = True
CELERY_TASK_TIME_LIMIT = 30 * 60
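The `env(...)` calls above assume a helper such as django-environ reading the URLs from environment variables; without one, the two URLs can simply be hard-coded. The Redis host and database numbers below are an example:

```python
# Hard-coded equivalent (broker on Redis db 1, results on db 2):
CELERY_BROKER_URL = "redis://127.0.0.1:6379/1"
CELERY_RESULT_BACKEND = "redis://127.0.0.1:6379/2"
```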
  • Create a tasks.py file inside the app and define the task there; task functions must be decorated with @shared_task
import time

from celery import shared_task


@shared_task
def sleep(duration):
    # Runs inside the Celery worker process, not in the web request.
    time.sleep(duration)
    return "success"
  • Create the view and the URL route
### views.py
from rest_framework.response import Response
from rest_framework.generics import GenericAPIView

from .tasks import sleep

class TestView1(GenericAPIView):
    def get(self, request):
        # .delay() enqueues the task and returns immediately;
        # the 10-second sleep happens in the worker.
        sleep.delay(10)
        return Response("celery test succeeded")

test_view_1 = TestView1.as_view()

### urls.py
from django.urls import path
from .views import (
    test_view_1
)

urlpatterns = [
    path('celery/', test_view_1, name="test1")
]
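As a rough mental model of why the view returns immediately (this is not Celery's actual implementation): @shared_task keeps the function directly callable and adds a .delay() method that only publishes a message to the broker instead of running the function inline. A minimal sketch, with a `queue.Queue` standing in for the broker:

```python
import queue

broker = queue.Queue()  # stand-in for the real message broker

def shared_task_sketch(func):
    # Toy model of @shared_task: the function stays directly callable
    # (runs inline), and gains a .delay() that only enqueues a message.
    def delay(*args, **kwargs):
        broker.put((func, args, kwargs))
    func.delay = delay
    return func

@shared_task_sketch
def add(x, y):
    return x + y

print(add(2, 3))                    # direct call runs synchronously: 5
add.delay(2, 3)                     # enqueues a message; nothing runs yet
func, args, kwargs = broker.get()   # a worker would do this...
print(func(*args, **kwargs))        # ...and execute the task: 5
```

In real Celery, calling the task directly (e.g. sleep(10)) would block the request for the full duration, which is why the view uses sleep.delay(10).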
  • Install Redis and start it
  • Start the Django project
  • Start the Celery service with: celery -A <project name> worker -l info. Output like the following means the worker started successfully:
celery@AppledeMacBook-Air.local v5.0.3 (singularity)

Darwin-20.1.0-x86_64-i386-64bit 2020-12-05 20:52:17

[config]
.> app:         drf_email_project:0x7f84a0c4ad68
.> transport:   redis://127.0.0.1:6379/1%20
.> results:     redis://127.0.0.1:6379/2
.> concurrency: 4 (prefork)
.> task events: OFF (enable -E to monitor tasks in this worker)

[queues]
.> celery           exchange=celery(direct) key=celery


[tasks]
  . drf_email_project.celery.debug_task
  . users.tasks.sleep

[2020-12-05 20:52:18,166: INFO/MainProcess] Connected to redis://127.0.0.1:6379/1%20
[2020-12-05 20:52:18,179: INFO/MainProcess] mingle: searching for neighbors
[2020-12-05 20:52:19,212: INFO/MainProcess] mingle: all alone
[2020-12-05 20:52:19,248: WARNING/MainProcess] /Users/apple/drf-email/lib/python3.7/site-packages/celery/fixups/django.py:204: UserWarning: Using settings.DEBUG leads to a memory
            leak, never use this setting in production environments!
  leak, never use this setting in production environments!''')

[2020-12-05 20:52:19,249: INFO/MainProcess] celery@AppledeMacBook-Air.local ready.