
Hadoop error: java.io.IOException at org.apache.hadoop.mapred.pipes.OutputHandler...

2013-11-23 16:57
Error log:

12/09/30 18:35:26 WARN mapred.JobClient: No job jar file set.  User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
12/09/30 18:35:26 INFO util.NativeCodeLoader: Loaded the native-hadoop library
12/09/30 18:35:26 WARN snappy.LoadSnappy: Snappy native library not loaded
12/09/30 18:35:26 INFO mapred.FileInputFormat: Total input paths to process : 1
12/09/30 18:35:27 INFO mapred.JobClient: Running job: job_201209301832_0001
12/09/30 18:35:28 INFO mapred.JobClient:  map 0% reduce 0%
12/09/30 18:35:40 INFO mapred.JobClient: Task Id : attempt_201209301832_0001_m_000000_0, Status : FAILED
java.io.IOException
at org.apache.hadoop.mapred.pipes.OutputHandler.waitForAuthentication(OutputHandler.java:188)
at org.apache.hadoop.mapred.pipes.Application.waitForAuthentication(Application.java:194)
at org.apache.hadoop.mapred.pipes.Application.<init>(Application.java:149)
at org.apache.hadoop.mapred.pipes.PipesMapRunner.run(PipesMapRunner.java:68)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
at org.apache.hadoop.mapred.Child.main(Child.java:249)

attempt_201209301832_0001_m_000000_0: Server failed to authenticate. Exiting

Solution:
Hello everybody!

After some googling and trial and error, I finally found the solution to this one. I suspect that other people may also come across this problem, so I'm posting it here.

I use Hadoop-1.0.3 (the tar.gz package, not the .deb or .rpm).
In my program's Makefile I was initially linking against the prebuilt libraries in $(HADOOP_INSTALL)/c++/Linux-amd64-64/.
I actually had to recompile them from source, with a couple of tweaks first, and link against the new ones instead.

So, first of all, since I'm running Slackware64 14.0, I enabled multilib support.

Then

1. Export the variable LIB=-lcrypto (I actually put it in /etc/profile, so that I don't have to export it every time):
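Code:
export LIB=-lcrypto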

2. In $(HADOOP_INSTALL)/src/c++/pipes/impl/HadoopPipes.cc add the following include (newer glibc/gcc versions no longer pull this header in transitively, so the file fails to compile without it):
Code:
#include <unistd.h>
3. In $(HADOOP_INSTALL)/src/contrib/gridmix/src/java/org/apache/hadoop/mapred/gridmix/Gridmix.java, replace the two(2) lines described here.
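The link above may be dead by now; the fix usually quoted for this Gridmix compile error (my assumption, so double-check against your own tree) is to drop the bounded type parameter from getEnumValues:
Code:
// Old (the two lines that fail to compile under newer javac):
private <T> String getEnumValues(Enum<? extends T>[] e) {
    for (Enum<? extends T> v : e) {

// New:
private String getEnumValues(Enum<?>[] e) {
    for (Enum<?> v : e) {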

4. In $(HADOOP_INSTALL)/src/c++/utils run
Code:
./configure
make install
5. In $(HADOOP_INSTALL)/src/c++/pipes run
Code:
./configure
make install
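Putting steps 4 and 5 together, the shell session looks roughly like this (assuming HADOOP_INSTALL points at the root of the extracted tarball; by default the headers and libraries land under $(HADOOP_INSTALL)/src/c++/install, which is what step 6 references):
Code:
cd $HADOOP_INSTALL/src/c++/utils
./configure
make install

cd $HADOOP_INSTALL/src/c++/pipes
./configure
make install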
6. In the new Makefile, use
Code:
-I$(HADOOP_INSTALL)/src/c++/install/include
-L$(HADOOP_INSTALL)/src/c++/install/lib -lhadooputils -lhadooppipes -lcrypto -lssl -lpthread
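For illustration, a minimal Makefile wiring these flags together might look like the sketch below (wordcount.cc, the target name, and the install path are placeholders, not part of the original recipe):
Code:
HADOOP_INSTALL = /path/to/hadoop-1.0.3
CXX = g++
CPPFLAGS = -I$(HADOOP_INSTALL)/src/c++/install/include
LDFLAGS = -L$(HADOOP_INSTALL)/src/c++/install/lib
LDLIBS = -lhadooputils -lhadooppipes -lcrypto -lssl -lpthread

wordcount: wordcount.cc
	$(CXX) $(CPPFLAGS) wordcount.cc -o wordcount $(LDFLAGS) $(LDLIBS)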
That was it. The program runs fine now.