
Spark cluster: running ./spark-shell from the bin directory prints "Unable to load native-hadoop library for your platform... using builtin-java classes where applicable"

2015-03-24 18:41

The Spark cluster starts up normally, but running ./spark-shell produces the error below.

Configuration file: spark-env.sh


export JAVA_HOME=/usr/java/jdk1.7.0_51
export SCALA_HOME=/home/hadoop/scala-2.11.6

export SPARK_MASTER_IP=master24
export SPARK_MASTER_PORT=17077
export SPARK_MASTER_WEBUI_PORT=18080

export SPARK_WORKER_CORES=1
export SPARK_WORKER_MEMORY=30g
export SPARK_WORKER_WEBUI_PORT=18081
export SPARK_WORKER_INSTANCES=1

The error output is as follows:
15/03/24 18:32:03 INFO DiskBlockManager: Created local directory at /tmp/spark-local-20150324183203-6f8e
15/03/24 18:32:03 INFO MemoryStore: MemoryStore started with capacity 294.9 MB.
15/03/24 18:32:03 INFO ConnectionManager: Bound socket to port 35104 with id = ConnectionManagerId(server2,35104)
15/03/24 18:32:03 INFO BlockManagerMaster: Trying to register BlockManager
15/03/24 18:32:03 INFO BlockManagerInfo: Registering block manager server2:35104 with 294.9 MB RAM
15/03/24 18:32:03 INFO BlockManagerMaster: Registered BlockManager
15/03/24 18:32:03 INFO HttpServer: Starting HTTP Server
15/03/24 18:32:03 INFO HttpBroadcast: Broadcast server started at http://192.168.1.24:41483
15/03/24 18:32:03 INFO HttpFileServer: HTTP File server directory is /tmp/spark-524059df-53c2-4df8-a2a0-c76c878a3d94
15/03/24 18:32:03 INFO HttpServer: Starting HTTP Server
15/03/24 18:32:03 INFO SparkUI: Started SparkUI at http://server12:4040
15/03/24 18:32:03 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
java.lang.IllegalArgumentException: java.net.UnknownHostException: cluster1
at org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:418)
at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:231)
at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:139)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:510)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:453)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:136)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2433)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:88)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2467)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2449)

at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.net.UnknownHostException: cluster1
... 61 more

Spark context available as sc.

scala>

What is causing this?
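Note that the "Unable to load native-hadoop library" line is only a WARN and is generally harmless; the actual failure is `java.net.UnknownHostException: cluster1`. A name like `cluster1` that is not a real host typically comes from an HDFS HA nameservice in `fs.defaultFS`, which suggests Spark cannot see the Hadoop client configuration that defines it. A minimal sketch of a likely fix, assuming the Hadoop configuration lives under the path shown (an assumption, not confirmed by the post), is to add this to spark-env.sh:

```shell
# Hypothetical fix: expose the Hadoop client configuration to Spark so the
# HA nameservice "cluster1" (defined via dfs.nameservices in hdfs-site.xml,
# with fs.defaultFS=hdfs://cluster1 in core-site.xml) can be resolved.
# Adjust the path to your actual Hadoop conf directory.
export HADOOP_CONF_DIR=/home/hadoop/hadoop/etc/hadoop
```

After adding this, restart the Spark cluster and re-run ./spark-shell; if the nameservice is defined correctly, the UnknownHostException should disappear.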