HDFS datanode connection error from a Docker container

I am trying to submit a Spark job from my development machine. Spark is running in a Docker container, and the job fails with the error below. I added -p 50010:50010 to the docker run command, but still no luck.
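Roughly, the container is started like this (the image name is a placeholder; the 50010 mapping is the one mentioned above):

```sh
# Placeholder image name; the relevant part is publishing the
# datanode's data-transfer port (50010) to the host.
docker run -d \
  -p 50010:50010 \
  my-hadoop-image
```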

Are there any other settings I need to configure to enable connections to the datanode?

> 16/02/04 09:15:10 INFO DFSClient: Exception in createBlockOutputStream
> org.apache.hadoop.net.ConnectTimeoutException: 60000 millis timeout while waiting for channel to be ready for connect. ch : java.nio.channels.SocketChannel[connection-pending remote=/178.19.0.2:50010]
>     at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:533)
>     at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1610)
>     at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1408)
>     at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1361)
>     at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:588)
> 16/02/04 09:15:10 INFO DFSClient: Abandoning BP-1937503393-178.19.1.10-1440740598640:blk_1073741858_1035
> 16/02/04 09:15:10 INFO DFSClient: Excluding datanode 178.19.0.2:50010
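From the log, the client is dialing the datanode at 178.19.0.2, which looks like a Docker-internal address my machine cannot reach directly. One thing I considered (but have not confirmed) is telling the HDFS client to connect by hostname instead of IP, passed through spark-submit; the class and jar names below are placeholders:

```sh
# Guess: spark.hadoop.* properties are forwarded to the Hadoop
# Configuration, so this would set dfs.client.use.datanode.hostname
# on the HDFS client side. Class and jar names are placeholders.
spark-submit \
  --conf spark.hadoop.dfs.client.use.datanode.hostname=true \
  --class com.example.MyApp \
  my-app.jar
```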