Celery worker in Docker doesn't get the correct broker

I'm building a Flask service using the application factory pattern, and I need to use Celery for asynchronous tasks. I'm also using Docker and docker-compose to contain and run everything. My structure looks like this:

```
server
+-- manage.py
+-- docker-compose.yml
+-- requirements.txt
+-- Dockerfile
+-- project
    +-- api
    |   +-- tasks.py
    +-- __init__.py
```

My tasks.py file looks like this:

```python
# project/api/tasks.py
from project import celery_app

@celery_app.task
def celery_check(test):
    print(test)
```

I run everything through manage.py, which looks like this:

```python
# manage.py
from flask_script import Manager
from project import create_app

app = create_app()
manager = Manager(app)

if __name__ == '__main__':
    manager.run()
```

And my __init__.py looks like this:

```python
# project/__init__.py
import os
import json

from flask_mongoalchemy import MongoAlchemy
from flask_cas import CAS
from flask import Flask
from itsdangerous import JSONWebSignatureSerializer as JWT
from flask_httpauth import HTTPTokenAuth
from celery import Celery

# instantiate the database and CAS
db = MongoAlchemy()
cas = CAS()

# Auth stuff (ReplaceMe is replaced below in create_app())
jwt = JWT("ReplaceMe")
auth = HTTPTokenAuth('Bearer')

celery_app = Celery(__name__, broker=os.environ.get("CELERY_BROKER_URL"))

def create_app():
    # instantiate the app
    app = Flask(__name__,
                template_folder='client/templates',
                static_folder='client/static')

    # set config
    app_settings = os.getenv('APP_SETTINGS')
    app.config.from_object(app_settings)

    # Send new static files every time if debug is enabled
    if app.debug:
        app.config['SEND_FILE_MAX_AGE_DEFAULT'] = 0

    # Get the secret keys
    parse_secret(app.config['CONFIG_FILE'], app)

    celery_app.conf.update(app.config)
    print(celery_app.conf)

    # set up extensions
    db.init_app(app)
    cas.init_app(app)

    # Replace the secret key with the app's
    jwt.secret_key = app.config["SECRET_KEY"]

    parse_config(app.config['CONFIG_FILE'])

    # register blueprints
    from project.api.views import twist_blueprint
    app.register_blueprint(twist_blueprint)

    return app
```

In my docker-compose file I start a worker and define some environment variables like this:

```yaml
version: '2.1'
services:
  twist-service:
    container_name: twist-service
    build: .
    volumes:
      - '.:/usr/src/app'
    ports:
      - 5001:5000 # expose ports - HOST:CONTAINER
    environment:
      - APP_SETTINGS=project.config.DevelopmentConfig
      - DATABASE_NAME_TESTING=testing
      - DATABASE_NAME_DEV=dev
      - DATABASE_URL=twist-database
      - CONFIG_FILE=./project/default_config.json
      - MONGO_PASSWORD=user
      - CELERY_RESULT_BACKEND=redis://redis:6379
      - CELERY_BROKER_URL=redis://redis:6379/0
      - MONGO_PORT=27017
    depends_on:
      - celery
      - twist-database
  celery:
    container_name: celery
    build: .
    command: celery -A project.api.tasks --loglevel=debug worker
    volumes:
      - '.:/usr/src/app'
  twist-database:
    image: mongo:latest
    container_name: "twist-database"
    environment:
      - MONGO_DATA_DIR=/data/db
      - MONGO_USER=mongo
    volumes:
      - /data/db
    ports:
      - 27017:27017 # expose ports - HOST:CONTAINER
    command: mongod
  redis:
    image: "redis:alpine"
    command: redis-server
    volumes:
      - '/redis'
    ports:
      - '6379:6379'
```

But when I run my docker-compose file and the containers come up, I get this in the Celery worker's logs:

```
[2017-07-20 16:53:06,721: ERROR/MainProcess] consumer: Cannot connect to amqp://guest:**@127.0.0.1:5672//: [Errno 111] Connection refused.
```

This means that when the Celery app was created, the worker ignored the Redis broker settings and tried to use RabbitMQ instead. I tried changing `project.api.tasks` to `project` and to `project.celery_app`, but to no avail.
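A quick way to see why the worker falls back to RabbitMQ: `celery_app` is created at import time from `os.environ.get("CELERY_BROKER_URL")`, and in the compose file above that variable is only set on `twist-service`, not on the `celery` service. A minimal sketch (simulating the worker container by removing the variable):

```python
import os

# Simulate the celery container: docker-compose sets CELERY_BROKER_URL
# only on twist-service, so inside the worker the variable is absent.
os.environ.pop("CELERY_BROKER_URL", None)

# This is what project/__init__.py evaluates when the worker imports it.
broker = os.environ.get("CELERY_BROKER_URL")
print(broker)  # prints "None": Celery then falls back to its default amqp:// broker
```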

It looks to me like the `celery` service should also have the environment variables `CELERY_RESULT_BACKEND` and `CELERY_BROKER_URL`.
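Following that suggestion, the `celery` service in the compose file would carry the same broker variables as `twist-service` (a sketch, reusing the values from the question):

```yaml
  celery:
    container_name: celery
    build: .
    command: celery -A project.api.tasks --loglevel=debug worker
    environment:
      - CELERY_RESULT_BACKEND=redis://redis:6379
      - CELERY_BROKER_URL=redis://redis:6379/0
    volumes:
      - '.:/usr/src/app'
```

With these set, `os.environ.get("CELERY_BROKER_URL")` inside the worker container resolves to the Redis URL instead of `None`.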

You need to link your Docker services together. The most straightforward mechanism is to add a `networks` section to your docker-compose file.
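A hypothetical sketch of that approach: declare a named network and attach each service that needs to talk to Redis to it (the network name `twist-net` is an assumption, not from the question):

```yaml
networks:
  twist-net:

services:
  celery:
    networks:
      - twist-net
  redis:
    networks:
      - twist-net
```

Services on the same network can then reach each other by service name, so `redis://redis:6379/0` resolves from inside the worker container.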