HOWTO: a simple Django + Celery + RabbitMQ setup for testing purposes
2015-08-14 22:32
This setup is intended purely for testing, and follows the official "First Steps with Celery" and "First Steps with Django" guides as references.
Installation
Step 1. Install the EPEL repository
Use yum search epel to find the exact package name for your distribution, then use yum install to install EPEL.
Step 2. Install Erlang
Erlang is required for RabbitMQ to work. Use yum install erlang to install it.
Step 3. Install RabbitMQ
RabbitMQ is the message broker we chose. Use yum install rabbitmq-server to install it.
Step 4. Install Celery
Installing Celery is easy: pip install celery
Step 5. Install Django
Installing Django is just as easy: pip install django
Configuration
First, set up a very basic Django project: django-admin startproject proj
Then create a new proj/proj/celery.py module that defines the Celery instance:
File: proj/proj/celery.py

from __future__ import absolute_import

import os

from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

from django.conf import settings

app = Celery('proj')

# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
Then you need to import this app in your proj/proj/__init__.py module. This ensures that the app is loaded when Django starts, so that the @shared_task decorator (mentioned later) will use it:
File: proj/proj/__init__.py

from __future__ import absolute_import

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
Next, create a demoapp app in your proj project:
python manage.py startapp demoapp
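One step that is easy to miss: autodiscover_tasks only scans the apps listed in INSTALLED_APPS, so demoapp must be registered there or its tasks.py will never be found. A settings fragment (assuming the default layout generated by django-admin startproject):

```python
# proj/settings.py (fragment)
INSTALLED_APPS = (
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'demoapp',  # required for task autodiscovery
)
```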
Next, create a tasks.py in demoapp.
IMPORTANT: Make sure the file is named tasks.py. Otherwise, it will not be autodiscovered.
The tasks you write will probably live in reusable apps, and reusable
apps cannot depend on the project itself, so you also cannot import
your app instance directly.
The @shared_task decorator lets you create tasks without having any
concrete app instance:
File: demoapp/tasks.py

from __future__ import absolute_import

from celery import shared_task

@shared_task
def add(x, y):
    return x + y

@shared_task
def mul(x, y):
    return x * y

@shared_task
def xsum(numbers):
    return sum(numbers)
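Since @shared_task wraps an ordinary function, the task bodies can be sanity-checked synchronously, without any broker involved. A minimal sketch with the decorator stripped away (plain Python, no Celery required):

```python
# The same task bodies as above, minus @shared_task; the decorator
# changes how a function can be dispatched, not what it computes.
def add(x, y):
    return x + y

def mul(x, y):
    return x * y

def xsum(numbers):
    return sum(numbers)

print(add(2, 3))        # 5
print(mul(4, 5))        # 20
print(xsum([1, 2, 3]))  # 6
```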
Next, configure the broker and result backend for Celery. Since we have already told Celery to look for configuration in proj/settings.py, add the following to that file:

BROKER_URL = 'amqp://guest:guest@localhost:5672//'
CELERY_RESULT_BACKEND = 'amqp'
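For reference, the broker URL follows the usual transport://user:password@host:port/vhost shape; the trailing '//' means the virtual host is '/', RabbitMQ's default. A quick sketch using only the standard library to pick the URL from settings apart:

```python
# Decompose the broker URL into its parts; the vhost lives in the path,
# so a trailing '//' means vhost '/' (RabbitMQ's default).
from urllib.parse import unquote, urlsplit

url = urlsplit('amqp://guest:guest@localhost:5672//')
print(url.scheme)                    # amqp
print(url.username)                  # guest
print(url.hostname)                  # localhost
print(url.port)                      # 5672
print(unquote(url.path[1:]) or '/')  # vhost: /
```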
Test
RabbitMQ
Start rabbitmq and run it in the background: rabbitmq-server -detached
Stop rabbitmq: rabbitmqctl stop
Check rabbitmq status: rabbitmqctl status
Celery
Start a Celery worker process: celery -A proj worker -l info
IMPORTANT: To test your @shared_task functions in tasks.py, you must start the Python shell with "python manage.py shell". That loads the configuration in proj/proj/__init__.py.
If you start a plain Python shell with "python" instead, the configuration in proj/proj/__init__.py will not be read, so @shared_task will not function properly. For example, calling .ready() on a task result will raise AttributeError: 'DisabledBackend' object has no attribute '_get_task_meta_for'.
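Putting it together, a typical session inside "python manage.py shell" might look like the following. This is a sketch, not runnable on its own: it assumes RabbitMQ and the worker from the previous step are both running.

```python
# Inside "python manage.py shell", with RabbitMQ and the worker running.
from demoapp.tasks import add

result = add.delay(2, 3)  # send the task to the broker; returns an AsyncResult
result.ready()            # False until the worker has finished the task
result.get(timeout=10)    # wait for the worker, then return the task's result
```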