Docker: can't send data from a Logstash container to a Kafka container

I have two Docker containers: one running Logstash and another running Zookeeper and Kafka. I am trying to send data from Logstash to Kafka, but nothing seems to reach my topic in Kafka.

I can log into the Kafka container, send messages to my topic from the terminal, and then consume them there.
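For reference, this is the kind of round trip I mean, using the console scripts that ship with Kafka (the paths and the Zookeeper port are assumptions based on a default installation):

    bin/kafka-console-producer.sh --broker-list localhost:9092 --topic MyTopicName
    bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic MyTopicName --from-beginning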

I am using the kafka output plugin:

    output {
      kafka {
        topic_id => "MyTopicName"
        broker_list => "kafkaIPAddress:9092"
      }
    }

Running docker inspect kafka2

When I run ./bin/logstash agent --config /etc/logstash/conf.d/01-input.conf I get this error:

    Settings: Default pipeline workers: 4
    Unknown setting 'broker_list' for kafka {:level=>:error}
    Pipeline aborted due to error {:exception=>#<LogStash::ConfigurationError: Something is wrong with your configuration.>,
      :backtrace=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/config/mixin.rb:134:in `config_init'",
      "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/outputs/base.rb:63:in `initialize'",
      "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/output_delegator.rb:74:in `register'",
      "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:181:in `start_workers'",
      "org/jruby/RubyArray.java:1613:in `each'",
      "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:181:in `start_workers'",
      "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:136:in `run'",
      "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/agent.rb:473:in `start_pipeline'"],
      :level=>:error}
    stopping pipeline {:id=>"main"}

I checked the configuration file by running the command below, which returned OK:

    ./bin/logstash agent --configtest --config /etc/logstash/conf.d/01-input.conf
    Configuration OK

Has anyone run into this before? Do I need to open the port on the Kafka container, and if so, how can I do that while keeping Kafka running?

The error is here: broker_list => "kafkaIPAddress:9092". The version of the kafka output plugin you are running does not recognize that setting (that is exactly what Unknown setting 'broker_list' means); replace it with bootstrap_servers => "KafkaIPAddress:9092".
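With that change the output section would read as follows (a sketch; the broker address is a placeholder that must be reachable from inside the Logstash container, see below):

    output {
      kafka {
        topic_id => "MyTopicName"
        bootstrap_servers => "KafkaIPAddress:9092"
      }
    }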

If the containers run on different machines, map Kafka's port 9092 to the host and use the host address:port; if they run on the same host, use the internal Docker IP:port.
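As a sketch of the cross-machine case, assuming the container is started with Docker's standard -p port-publishing flag (the container name matches the question, but the image name is a placeholder):

    # Publish Kafka's port on the Docker host so other machines can reach it
    docker run -d --name kafka2 -p 9092:9092 some-kafka-image

    # ...then point the Logstash config at the host, not the container:
    # bootstrap_servers => "dockerHostAddress:9092"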