Implementing an Asynchronous Task Queue in Django with Celery

This walkthrough was tested on CentOS, using Redis as the message broker (middleware).

First, install Redis; the Redis installation itself is not covered here.

 

Once Redis is installed, install Celery:

pip install celery
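
Since Redis serves as both the broker and the result backend here, the redis Python client is needed as well; it can be installed together with Celery through the redis bundle:

pip install "celery[redis]"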

 

1. Add the following to the Django project's settings.py (placeholders such as project_name in the snippets below should be replaced with your own names):

# Celery settings
CELERY_BROKER_URL = 'redis://localhost'
CELERY_RESULT_BACKEND = 'redis://localhost'
CELERY_IMPORTS = ("<module containing the task functions>", )        # without this line the tasks were reported as unregistered (the cause was never tracked down), so this setting imports the task module explicitly
#: Only add pickle to this list if your broker is secured
#: from unwanted access (see userguide/security.html)

CELERY_TASK_TRACK_STARTED = True
CELERY_TASK_SOFT_TIME_LIMIT = 240
CELERY_TASK_TIME_LIMIT = 300

CELERY_WORKER_SEND_TASK_EVENTS = True
CELERY_TASK_SEND_SENT_EVENT = True

CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
settings.py
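
As a concrete illustration, if the task file from step 3 lives in a Django app (here called myapp, a hypothetical name) as myapp/task.py, the import line would read:

CELERY_IMPORTS = ("myapp.task", )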

 

2. Create celery.py inside the project package (the same directory as settings.py):

from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project_name.settings')

app = Celery('project_name')
# 'project_name' above is your Django project's name

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
celery.py
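
The Celery documentation for Django additionally recommends loading this app whenever Django starts, by importing it in the project package's __init__.py; a minimal sketch (project_name is a placeholder as above):

# project_name/__init__.py
from .celery import app as celery_app

__all__ = ('celery_app',)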

 

3. Then write the function that will run as a task:

import time
from celery import Celery

# a standalone Celery instance configured directly in this module
celery_app = Celery('task', backend='redis://localhost', broker='redis://localhost')

# a deliberately slow function used to simulate long-running work
@celery_app.task
def add(a, b):
    time.sleep(5)
    return a + b
task.py
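
Note that app.autodiscover_tasks() by default only looks for modules named tasks.py inside the installed apps; since this file is named task.py, it is the CELERY_IMPORTS setting from step 1 that makes the worker register the task. To actually queue the task, call it with .delay() rather than calling it directly; below is a minimal sketch of a Django view doing so (the view name, URL wiring and import path are assumptions):

# views.py (illustrative)
from django.http import JsonResponse

from myapp.task import add   # myapp is the hypothetical app holding task.py


def start_add(request):
    # .delay() pushes the task onto the Redis broker and returns at once
    result = add.delay(2, 3)
    return JsonResponse({'task_id': result.id})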

 

4. Finally, start the worker:

celery -A project_name worker --loglevel=info --pool=solo
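
With the worker running, the state and return value of a queued task can be inspected through the AsyncResult that .delay() returns, for example from the Django shell; a minimal sketch:

# inside: python manage.py shell
from myapp.task import add     # myapp is the hypothetical app holding task.py

result = add.delay(4, 6)       # returns an AsyncResult immediately
print(result.status)           # 'STARTED' while the task sleeps (CELERY_TASK_TRACK_STARTED), then 'SUCCESS'
print(result.get(timeout=10))  # blocks until the task finishes and prints 10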