Running MapReduce from Eclipse on Windows: Configuration Notes
2016-04-28 10:41
This post summarizes the steps for running a MapReduce program from Eclipse on Windows 7, based on my recent experience learning Hadoop.
1. Download or build hadoop-eclipse-plugin-2.2.0.jar yourself (details can be found online), copy the jar into Eclipse's plugins directory, and restart Eclipse.
2. Unpack hadoop.tar.gz to a local disk, then point Eclipse at that Hadoop directory under Eclipse -> Window -> Preferences -> Data Management -> Hadoop Map/Reduce, as in the figure below.
3. Create a new MapReduce project via File -> New -> Map/Reduce Project (the wizard steps are omitted here).
4. Download winutils.exe and hadoop.dll, copy them into Windows\System32, and restart the machine; both files can be downloaded via http://blog.csdn.net/cnxieyang/article/details/51272093
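Besides placing winutils.exe on the system, Hadoop also needs to know where its home directory is; it reads the hadoop.home.dir system property (falling back to the HADOOP_HOME environment variable). A minimal sketch of setting it from the driver before submitting a job — the path D:\hadoop-2.2.0 is an example, substitute your own unpacked directory:

```java
public class HadoopHomeSetup {
    public static void main(String[] args) {
        // Hadoop's Shell utility looks up hadoop.home.dir to locate
        // bin\winutils.exe; set it before the first Job/FileSystem call.
        // "D:\\hadoop-2.2.0" is a placeholder path.
        System.setProperty("hadoop.home.dir", "D:\\hadoop-2.2.0");
        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```

Alternatively, set HADOOP_HOME as a system-wide environment variable so every run picks it up without code changes.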
5. Obtain Hadoop's NativeIO.java source (e.g. from the Hadoop Git repository), create a file with the same package and class name in your own project (org.apache.hadoop.io.nativeio.NativeIO), and change the access method to return true. The modified method:
public static boolean access(String path, AccessRight desiredAccess)
        throws IOException {
    // Skip the native Windows permission check:
    // return access0(path, desiredAccess.accessRight());
    return true;
}
Because your project's copy shadows the class in the Hadoop jar, this bypasses the Windows file-access check for the current process and resolves the following exception:
Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:552)
at org.apache.hadoop.fs.FileUtil.canRead(FileUtil.java:977)
at org.apache.hadoop.util.DiskChecker.checkAccessByFileMethods(DiskChecker.java:187)
at org.apache.hadoop.util.DiskChecker.checkDirAccess(DiskChecker.java:174)
at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:108)
at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.confChanged(LocalDirAllocator.java:285)
at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.getLocalPathForWrite(LocalDirAllocator.java:344)
at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:150)
at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:131)
at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:115)
at org.apache.hadoop.mapred.LocalDistributedCacheManager.setup(LocalDistributedCacheManager.java:131)
at org.apache.hadoop.mapred.LocalJobRunner$Job.<init>(LocalJobRunner.java:163)
at org.apache.hadoop.mapred.LocalJobRunner.submitJob(LocalJobRunner.java:731)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:536)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1296)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1293)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Unknown Source)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1293)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1314)
at org.springframework.samples.hadoop.mapreduce.MyWordCount.main(MyWordCount.java:68)
6. Run the WordCount.java example that ships with Hadoop to verify the setup.
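Independent of the cluster setup, the logic of the WordCount job reduces to tokenizing each input line and summing a count per word. A plain-Java sketch of that logic, useful for sanity-checking expected output before running the real job (whitespace tokenization is an assumption; Hadoop's example uses StringTokenizer, which behaves similarly):

```java
import java.util.HashMap;
import java.util.Map;

public class LocalWordCount {
    // Map phase: emit (word, 1) per token; reduce phase: sum per word.
    // Here both phases are collapsed into one in-memory pass.
    public static Map<String, Integer> count(String[] lines) {
        Map<String, Integer> counts = new HashMap<>();
        for (String line : lines) {
            for (String word : line.trim().split("\\s+")) {
                if (!word.isEmpty()) {
                    counts.merge(word, 1, Integer::sum);
                }
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        System.out.println(count(new String[]{"hello world", "hello hadoop"}));
    }
}
```

If the Eclipse run of the real WordCount produces the same per-word totals on a small input file, the plugin, winutils, and NativeIO patch are all working.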