Hadoop "Unable to load native-hadoop library for your platform" error on docker-spark?

I am using docker-spark. After starting spark-shell, it outputs:

 15/05/21 04:28:22 DEBUG NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
 15/05/21 04:28:22 DEBUG NativeCodeLoader: java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
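(For reference: by default Spark only prints a generic "Unable to load native-hadoop library for your platform" WARN; the DEBUG lines above with the underlying UnsatisfiedLinkError only appear when NativeCodeLoader logging is turned up. A minimal sketch of how to do that, assuming the image keeps Spark's conf under /usr/local/spark/conf and ships the usual log4j.properties.template:)

 # sketch: turn up NativeCodeLoader logging so the underlying
 # UnsatisfiedLinkError is printed, not just the generic WARN
 cp /usr/local/spark/conf/log4j.properties.template /usr/local/spark/conf/log4j.properties
 echo 'log4j.logger.org.apache.hadoop.util.NativeCodeLoader=DEBUG' >> /usr/local/spark/conf/log4j.properties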

The environment variables of this spark container are:

 bash-4.1# export
 declare -x BOOTSTRAP="/etc/bootstrap.sh"
 declare -x HADOOP_COMMON_HOME="/usr/local/hadoop"
 declare -x HADOOP_CONF_DIR="/usr/local/hadoop/etc/hadoop"
 declare -x HADOOP_HDFS_HOME="/usr/local/hadoop"
 declare -x HADOOP_MAPRED_HOME="/usr/local/hadoop"
 declare -x HADOOP_PREFIX="/usr/local/hadoop"
 declare -x HADOOP_YARN_HOME="/usr/local/hadoop"
 declare -x HOME="/"
 declare -x HOSTNAME="sandbox"
 declare -x JAVA_HOME="/usr/java/default"
 declare -x OLDPWD
 declare -x PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/java/default/bin:/usr/local/spark/bin:/usr/local/hadoop/bin"
 declare -x PWD="/"
 declare -x SHLVL="3"
 declare -x SPARK_HOME="/usr/local/spark"
 declare -x SPARK_JAR="hdfs:///spark/spark-assembly-1.3.0-hadoop2.4.0.jar"
 declare -x TERM="xterm"
 declare -x YARN_CONF_DIR="/usr/local/hadoop/etc/hadoop"

After referring to Hadoop "Unable to load native-hadoop library for your platform" error on CentOS, I did the following:

(1) Checked the hadoop native library:

 bash-4.1# file /usr/local/hadoop/lib/native/libhadoop.so.1.0.0
 /usr/local/hadoop/lib/native/libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped

Yes, it is a 64-bit library.
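Two related checks may be worth adding here (a sketch; `hadoop checknative` is available in Hadoop 2.4+ builds, and the java path is taken from JAVA_HOME in the env dump above): confirm the JVM itself is 64-bit, and ask Hadoop which native libraries it can actually load.

 # a 32-bit JVM cannot load a 64-bit libhadoop.so, so check the JVM architecture too
 file /usr/java/default/bin/java
 # ask Hadoop itself which native libraries it can load
 hadoop checknative -a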

(2) Tried adding the HADOOP_OPTS environment variable:

 export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/usr/local/hadoop/lib/native" 

It did not work and reported the same error.
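That is not too surprising: HADOOP_OPTS is read by the hadoop launcher scripts, not by spark-shell, so it never reaches the driver JVM. If the goal is to get the directory onto the driver's java.library.path, spark-shell/spark-submit expose a flag for that (a sketch, using the native directory verified in step (1)):

 # pass the native library directory to the Spark driver JVM directly
 spark-shell --driver-library-path /usr/local/hadoop/lib/native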

(3) Tried adding the HADOOP_OPTS and HADOOP_COMMON_LIB_NATIVE_DIR environment variables:

 export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
 export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"

It still did not work and reported the same error.
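Note that the env dump above defines HADOOP_PREFIX but no HADOOP_HOME, so $HADOOP_HOME/lib/native most likely expands to /lib/native here. A sketch of the same exports written against what the container actually defines (though, as noted above, these variables still only affect the hadoop scripts, not spark-shell):

 echo "HADOOP_HOME='${HADOOP_HOME}'"   # likely empty in this image
 export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_PREFIX/lib/native
 export HADOOP_OPTS="-Djava.library.path=$HADOOP_PREFIX/lib/native"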

Can anyone give some clues about this problem?

Adding the Hadoop native library directory to LD_LIBRARY_PATH fixes this problem:

 export LD_LIBRARY_PATH=/usr/local/hadoop/lib/native/:$LD_LIBRARY_PATH
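To avoid re-exporting it in every new shell, the same line can go into spark-env.sh, which the Spark launch scripts source on startup (a sketch; the /usr/local/spark path comes from SPARK_HOME in the env dump above, and the file may need to be created from spark-env.sh.template first):

 # make the fix persistent for every spark-shell / spark-submit invocation
 echo 'export LD_LIBRARY_PATH=/usr/local/hadoop/lib/native/:$LD_LIBRARY_PATH' >> /usr/local/spark/conf/spark-env.sh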