Mounting Kafka Connect logs from a Docker container onto the local host

I have Kafka Connect running in a Docker container, and it writes its log to a file in the /tmp directory. The log generated inside the container at /tmp/kafka-connect.logs should be mounted to /tmp2/kafka-connect.logs on the local host. Mounting the host's /tmp directory works (docker run -p 8083:8083 -v /tmp/:/tmp/ -i jdbc), but mounting the /tmp2 directory fails (docker run -p 8083:8083 -v /tmp2/:/tmp/ -i jdbc). The /tmp2 directory has 777 permissions and the container is run as root.
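For reference, a quick way to compare the two host directories and to check what the container actually sees when /tmp2 is mounted over /tmp (this uses the same jdbc image and only inspects, it does not change anything):

    # On the host: both directories should look writable
    ls -ld /tmp /tmp2

    # Inside the container: mount /tmp2 over /tmp, overriding the normal
    # start script, and try the exact path the FileAppender writes to
    docker run --rm -v /tmp2/:/tmp/ jdbc ls -ld /tmp
    docker run --rm -v /tmp2/:/tmp/ jdbc touch /tmp/kafka-connect.logs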

connect-log4j.properties

    # Root logger option
    log4j.rootLogger = INFO, FILE, stdout

    log4j.appender.stdout=org.apache.log4j.ConsoleAppender
    log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
    log4j.appender.stdout.layout.ConversionPattern=%d\t%p\t%m\t(%c:%L)%n

    # Direct log messages to stdout
    log4j.appender.FILE=org.apache.log4j.FileAppender
    log4j.appender.FILE.File=/tmp/kafka-connect.logs

    # Define the layout for file appender
    log4j.appender.FILE.layout=org.apache.log4j.PatternLayout

    # use a more detailed message pattern
    log4j.appender.FILE.layout.ConversionPattern=%d\t%p\t%m\t(%c:%L)%n

    log4j.logger.org.apache.zookeeper=ERROR
    log4j.logger.org.I0Itec.zkclient=ERROR

Dockerfile

    FROM confluent/platform
    MAINTAINER contact@confluent.io

    USER root

    COPY Test.jar /usr/local/bin/
    COPY kafka-connect-docker.sh /usr/local/bin/
    COPY connect-distributed.properties /usr/local/bin/
    COPY connect-log4j.properties /etc/kafka/

    RUN ["apt-get", "update"]
    RUN ["apt-get", "install", "-yq", "curl"]

    RUN ["chown", "-R", "confluent:confluent", "/usr/local/bin/kafka-connect-docker.sh", "/usr/local/bin/connect-distributed.properties", "/usr/local/bin/Test.jar"]
    RUN ["chmod", "+x", "/usr/local/bin/kafka-connect-docker.sh", "/usr/local/bin/connect-distributed.properties", "/usr/local/bin/Test.jar"]

    VOLUME [ "/tmp" ]

    EXPOSE 8083

    CMD [ "/usr/local/bin/kafka-connect-docker.sh" ]
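The image is tagged as jdbc (the name used in the run commands); assuming the Dockerfile and the copied files live in the current directory, it is built with something like:

    # build and tag the image that the run commands refer to as "jdbc"
    docker build -t jdbc .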

Command to run the Docker image (container)

    docker run -p 8083:8083 -v /tmp2/:/tmp/ -i jdbc

Error seen

    log4j:ERROR setFile(null,true) call failed.
    java.io.FileNotFoundException: /tmp/kafka-connect.logs (Permission denied)
        at java.io.FileOutputStream.open0(Native Method)
        at java.io.FileOutputStream.open(FileOutputStream.java:270)
        at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
        at java.io.FileOutputStream.<init>(FileOutputStream.java:133)
        at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
        at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
        at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
        at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
        at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
        at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
        at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
        at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
        at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
        at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
        at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
        at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
        at org.slf4j.impl.Log4jLoggerFactory.<init>(Log4jLoggerFactory.java:66)
        at org.slf4j.impl.StaticLoggerBinder.<init>(StaticLoggerBinder.java:72)
        at org.slf4j.impl.StaticLoggerBinder.<clinit>(StaticLoggerBinder.java:45)
        at org.slf4j.LoggerFactory.bind(LoggerFactory.java:150)
        at org.slf4j.LoggerFactory.performInitialization(LoggerFactory.java:124)
        at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:412)
        at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:357)
        at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:383)
        at org.apache.kafka.connect.cli.ConnectDistributed.<clinit>(ConnectDistributed.java:52
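For completeness: since /tmp2 already has 777 permissions, the only host-side difference from /tmp I can point at is the SELinux label of the directory (if the host enforces SELinux). The check and the relabelling mount option below are assumptions on my part, not a verified fix:

    # Compare SELinux labels of the two directories (only meaningful on
    # hosts that enforce SELinux)
    ls -ldZ /tmp /tmp2

    # Ask Docker to relabel the bind mount (lowercase z = shared label);
    # without a suitable label, writes can fail with "Permission denied"
    # even though the directory itself is world-writable
    docker run -p 8083:8083 -v /tmp2/:/tmp/:z -i jdbc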