
Handling the Hadoop "Failed to set permissions of path" Error

2012-11-06 23:11
This exception came up while debugging Nutch.

Exception in thread "main" java.io.IOException: Failed to set permissions of path: \tmp\hadoop-Administrator\mapred\staging\Administrator-4954228\.staging to 0700
 at org.apache.hadoop.fs.FileUtil.checkReturnValue(FileUtil.java:689)
 at org.apache.hadoop.fs.FileUtil.setPermission(FileUtil.java:662)
 at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:509)
 at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:344)
 at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:189)
 at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:116)
 at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:856)
 at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
 at java.security.AccessController.doPrivileged(Native Method)
 at javax.security.auth.Subject.doAs(Unknown Source)
 at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
 at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:850)
 at org.apache.hadoop.mapreduce.Job.submit(Job.java:500)
 at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:530)
 at org.apache.nutch.util.NutchJob.waitForCompletion(NutchJob.java:50)
 at org.apache.nutch.crawl.GeneratorJob.run(GeneratorJob.java:191)
 at org.apache.nutch.crawl.Crawler.runTool(Crawler.java:68)
 at org.apache.nutch.crawl.Crawler.run(Crawler.java:152)
 at org.apache.nutch.crawl.Crawler.run(Crawler.java:250)
 at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
 at org.apache.nutch.crawl.Crawler.main(Crawler.java:257)

This is a Windows file-permission issue; under Linux the same code runs normally and the problem does not occur.
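To see why the check fails only on Windows, here is a minimal standalone sketch (not part of Hadoop; the class name `PermissionCheck` is made up for illustration). Without native libraries, Hadoop's local filesystem falls back to the `java.io.File` permission API, and on Windows that API cannot clear the read bit, so the call returns `false` and `checkReturnValue` turns that `false` into the `IOException` above:

```java
import java.io.File;
import java.io.IOException;

public class PermissionCheck {
    public static void main(String[] args) throws IOException {
        File f = File.createTempFile("perm-test", ".tmp");
        f.deleteOnExit();
        // Hadoop 1.x sets a 0700 staging dir by clearing "group/other"
        // permission bits through java.io.File. On Windows the JDK cannot
        // remove the read bit, so this returns false -- the same boolean
        // that FileUtil.checkReturnValue rejects with an IOException.
        boolean rv = f.setReadable(false, false);
        System.out.println("setReadable(false) returned: " + rv);
    }
}
```

On Linux the call typically returns `true`, which is why the job submits cleanly there.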

The fix is to edit checkReturnValue in /hadoop-1.0.2/src/core/org/apache/hadoop/fs/FileUtil.java and comment out its body (somewhat crude, but on Windows the check can safely be skipped):

// ...
private static void checkReturnValue(boolean rv, File p, FsPermission permission)
    throws IOException {
  /*
  if (!rv) {
    throw new IOException("Failed to set permissions of path: " + p +
        " to " + String.format("%04o", permission.toShort()));
  }
  */
}
// ...

Recompile and repackage hadoop-core-1.0.2.jar, then use it to replace the hadoop-core-1.0.2.jar in the hadoop-1.0.2 root directory.

A pre-modified hadoop-core-1.0.2-modified.jar is provided here; simply swap it in for the original hadoop-core-1.0.2.jar.

After the replacement, refresh the project, fix up the jar dependencies, and run WordCountTest again; it should now work.

Once the job succeeds, refresh the HDFS directory in Eclipse and you can see the generated output2 directory:


https://skydrive.live.com/?cid=cf7746837803bc50&id=CF7746837803BC50%211276