Celery not working on AWS ECS

I deployed my Django project to the AWS ECS service using Docker. To use Celery, I set up RabbitMQ on a separate EC2 server (so there are two EC2 instances, with the broker and result backend on the second one).

The problem is that the Celery worker works locally but not on AWS. When I run docker run --rm -it -p 8080:80 proj locally, the worker works.

But when I deploy the app on ECS, the worker does not run. I have to start a worker from my local Django project with celery -A mysite worker -l INFO, even though supervisord is configured to manage the worker.
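One way to see whether supervisord actually started the worker inside the running container is to check the program status and tail the worker's error log from the host. This is only a sketch: the container name below is hypothetical, and it assumes the stock Ubuntu supervisord.conf, which exposes the supervisorctl control socket.

 # Find the real container name with `docker ps`; "proj-container" is a placeholder.
 docker exec -it proj-container supervisorctl status
 docker exec -it proj-container tail -n 50 /var/log/celery-worker.err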

Here is my code.

Dockerfile

 FROM ubuntu:16.04

 # A layer is created for each command ex) RUN, ENV, COPY, etc...
 RUN apt-get -y update
 RUN apt-get -y install python3 python3-pip
 RUN apt-get -y install nginx
 RUN apt-get -y install python-dev libpq-dev
 RUN apt-get -y install supervisor

 WORKDIR /srv
 RUN mkdir app
 COPY . /srv/app
 WORKDIR /srv/app
 RUN pip3 install -r requirements.txt
 RUN pip3 install uwsgi

 ENV DEBUG="False" \
     STATIC="s3" \
     REGION="Tokyo"

 COPY .conf/uwsgi-app.ini /etc/uwsgi/sites/app.ini
 COPY .conf/nginx.conf /etc/nginx/nginx.conf
 COPY .conf/nginx-app.conf /etc/nginx/sites-available/app.conf
 COPY .conf/supervisor-app.conf /etc/supervisor/conf.d/
 COPY .conf/docker-entrypoint.sh /
 RUN ln -s /etc/nginx/sites-available/app.conf /etc/nginx/sites-enabled/app.conf

 EXPOSE 80

 CMD supervisord -n
 ENTRYPOINT ["/docker-entrypoint.sh"]

supervisor-app.conf

 [program:uwsgi]
 command = uwsgi --ini /etc/uwsgi/sites/app.ini

 [program:nginx]
 command = nginx

 [program:celery]
 directory = /srv/app/django_app/
 command = celery -A mysite worker -l INFO --concurrency=6
 numprocs=1
 stdout_logfile=/var/log/celery-worker.log
 stderr_logfile=/var/log/celery-worker.err
 autostart=true
 autorestart=true
 startsecs=10

 ; Need to wait for currently executing tasks to finish at shutdown.
 ; Increase this if you have very long running tasks.
 stopwaitsecs = 600

 ; When resorting to send SIGKILL to the program to terminate it
 ; send SIGKILL to its whole process group instead,
 ; taking care of its children as well.
 killasgroup=true

 ; if rabbitmq is supervised, set its priority higher
 ; so it starts first
 priority=998

settings.py

 # CELERY STUFF
 BROKER_URL = 'amqp://user:password@example.com//'
 CELERY_RESULT_BACKEND = 'amqp://user:password@example.com//'
 CELERY_ACCEPT_CONTENT = ['application/json']
 CELERY_TASK_SERIALIZER = 'json'
 CELERY_RESULT_SERIALIZER = 'json'

celery.py

 import os

 from celery import Celery
 from django.conf import settings

 # set the default Django settings module for the 'celery' program.
 os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'mysite.settings')

 app = Celery('mysite')

 # Using a string here means the worker will not have to
 # pickle the object when using Windows.
 app.config_from_object('django.conf:settings')
 app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)


 @app.task(bind=True)
 def debug_task(self):
     print('Request: {0!r}'.format(self.request))
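As a quick sanity check that the app defined above can actually reach the broker from inside the ECS container, the connection can be probed from a Django shell. This is a minimal sketch, assuming the settings.py shown earlier; it only verifies connectivity, nothing more:

 # python3 manage.py shell, run inside the container
 from mysite.celery import app

 conn = app.connection()                  # built from BROKER_URL in settings.py
 conn.ensure_connection(max_retries=3)    # raises if RabbitMQ is unreachable
 print(conn.as_uri())                     # shows which host the app is really using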

tasks.py

 from celery.task import task
 from celery.utils.log import get_task_logger

 from .helpers import send_password_email

 logger = get_task_logger(__name__)


 @task(name="send_password_email_task")
 def send_password_email_task(email, password):
     """Send an email when a user requests to find a password"""
     logger.info("Sent feedback email")
     return send_password_email(email, password)
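For completeness, the task is meant to be queued with .delay() rather than called directly, so that the worker picks it up from RabbitMQ. A short usage sketch (everything except the task itself is illustrative):

 # Example caller in application code; only send_password_email_task comes from tasks.py above.
 from .tasks import send_password_email_task

 def queue_password_email(email, password):
     # .delay() publishes the task to the broker; the Celery worker executes it.
     async_result = send_password_email_task.delay(email, password)
     return async_result.id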

Adding the Nginx configuration…

nginx.conf

 user root;
 worker_processes auto;
 pid /run/nginx.pid;
 include /etc/nginx/modules-enabled/*.conf;
 daemon off;

 events {
     worker_connections 768;
     # multi_accept on;
 }

 http {
     ##
     # Basic Settings
     ##

     sendfile on;
     tcp_nopush on;
     tcp_nodelay on;
     keepalive_timeout 65;
     types_hash_max_size 2048;
     # server_tokens off;

     server_names_hash_bucket_size 512;
     # server_name_in_redirect off;

     include /etc/nginx/mime.types;
     default_type application/octet-stream;

     ##
     # SSL Settings
     ##

     ssl_protocols TLSv1 TLSv1.1 TLSv1.2; # Dropping SSLv3, ref: POODLE
     ssl_prefer_server_ciphers on;

     ##
     # Logging Settings
     ##

     access_log /var/log/nginx/access.log;
     error_log /var/log/nginx/error.log;

     ##
     # Gzip Settings
     ##

     gzip on;
     gzip_disable "msie6";

     # gzip_vary on;
     # gzip_proxied any;
     # gzip_comp_level 6;
     # gzip_buffers 16 8k;
     # gzip_http_version 1.1;
     # gzip_types text/plain text/css application/json application/javascript text/xml application/xml application/xml+rss text/javascript;

     ##
     # Virtual Host Configs
     ##

     include /etc/nginx/conf.d/*.conf;
     include /etc/nginx/sites-enabled/*;
 }

 #mail {
 #   # See sample authentication script at:
 #   # http://wiki.nginx.org/ImapAuthenticateWithApachePhpScript
 #
 #   # auth_http localhost/auth.php;
 #   # pop3_capabilities "TOP" "USER";
 #   # imap_capabilities "IMAP4rev1" "UIDPLUS";
 #
 #   server {
 #       listen localhost:110;
 #       protocol pop3;
 #       proxy on;
 #   }
 #
 #   server {
 #       listen localhost:143;
 #       protocol imap;
 #       proxy on;
 #   }
 #}

nginx-app.conf

 server {
     listen 80;
     server_name localhost ~^(.+)$;
     charset utf-8;
     client_max_body_size 128M;

     location / {
         uwsgi_pass unix:///tmp/app.sock;
         include uwsgi_params;
     }
 }

ACL inbound rules

(image: ACL inbound rules)

Task definition

(image: ECS task definition)

If you want to run a Celery worker in ECS: by default the container is executed as the root user, and to run the worker as root you should set an environment variable, as described here:

http://docs.celeryproject.org/en/latest/userguide/daemonizing.html

When defining the variables in the task definition, set C_FORCE_ROOT = true.
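If you would rather bake the variable into the image than set it in the task definition, the same effect can be had with an ENV line in the Dockerfile shown above. This is only a sketch; where the setting lives (image vs. task definition) is a matter of preference:

 # Same effect as adding C_FORCE_ROOT=true to the ECS task definition's environment
 ENV C_FORCE_ROOT="true"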

The fix turned out to be very simple. The problem was that I had set the broker node to localhost. There was nothing wrong with the application code.
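In other words, BROKER_URL (and CELERY_RESULT_BACKEND) have to point at the separate RabbitMQ EC2 instance, not at localhost inside the ECS container. A sketch of the corrected setting; the hostname below is a placeholder for that EC2 instance's address:

 # settings.py -- "rabbitmq-ec2.example.com" stands in for the broker server's real address
 BROKER_URL = 'amqp://user:password@rabbitmq-ec2.example.com:5672//'
 CELERY_RESULT_BACKEND = BROKER_URL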