Compiling the Hadoop 1.0.4 Eclipse plugin in detail (works on both Windows and Linux)
2013-01-27 12:02
Part 1. Compiling Hadoop and the eclipse-plugin on Linux
Operating system: Ubuntu 10.04 x86
1. Download the packages
hadoop-1.0.4.tar.gz
eclipse-jee-indigo-SR2-linux-gtk.tar.gz
Extract both into /software, naming them hadoop and eclipse3.7 respectively, so that:
HADOOP_HOME=/software/hadoop
ECLIPSE_HOME=/software/eclipse3.7
2. Install the JDK and Ant
$ sudo apt-get install ant openjdk-6-jdk autoconf libtool
3. Edit ${HADOOP_HOME}/build.xml
3.1 Fix the Hadoop version on line 31. Change:
<property name="version" value="1.0.4-SNAPSHOT"/>
to:
<property name="version" value="1.0.4"/>
3.2 Comment out the ivy-download target on line 2418, since ivy.jar is already bundled:
<!--target name="ivy-download" description="To download ivy" unless="offline">
<get src="${ivy_repo_url}" dest="${ivy.jar}" usetimestamp="true"/>
</target-->
3.3 On line 2426, remove the dependency on ivy-download, keeping:
<target name="ivy-init-antlib" depends="ivy-init-dirs,ivy-probe-antlib"
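The edits in 3.1 and 3.3 can also be scripted. A minimal sketch, assuming GNU sed and the stock hadoop-1.0.4 build.xml wording quoted above; it is demonstrated on a scratch file, so point BUILD_XML at ${HADOOP_HOME}/build.xml for real use (3.2's multi-line comment is easier to do by hand):

```shell
# Scratch copy standing in for ${HADOOP_HOME}/build.xml (hypothetical excerpt)
BUILD_XML=$(mktemp)
cat > "$BUILD_XML" <<'EOF'
<property name="version" value="1.0.4-SNAPSHOT"/>
<target name="ivy-init-antlib" depends="ivy-download,ivy-init-dirs,ivy-probe-antlib">
EOF

# 3.1: drop the -SNAPSHOT suffix from the version property
sed -i 's/value="1.0.4-SNAPSHOT"/value="1.0.4"/' "$BUILD_XML"

# 3.3: remove ivy-download from ivy-init-antlib's dependency list
sed -i 's/depends="ivy-download,/depends="/' "$BUILD_XML"
```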
4. Edit /hadoop-1.0.4/src/core/org/apache/hadoop/fs/FileUtil.java and empty out the body of checkReturnValue, as shown below.
(This step is essential: on Windows, the stock hadoop-core-1.0.4.jar throws a permission-related error, e.g.:
12/04/24 15:32:44 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
With checkReturnValue stubbed out, the rebuilt hadoop-core-1.0.4.jar no longer hits this problem on Windows.)
......
private static void checkReturnValue(boolean rv, File p,
                                     FsPermission permission
                                     ) throws IOException {
}
......
5. Edit ${HADOOP_HOME}/src/contrib/build-contrib.xml
Add the first two property lines below (eclipse.home and version), which are not in the stock file, supplying the Eclipse path and the Hadoop version:
<project name="hadoopbuildcontrib" xmlns:ivy="antlib:org.apache.ivy.ant">
<property name="eclipse.home" location="/software/eclipse3.7"/>
<property name="version" value="1.0.4"/>
<property name="name" value="${ant.project.name}"/>
<property name="root" value="${basedir}"/>
<property name="hadoop.root" location="${root}/../../../"/>
...
</project>
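If you prefer not to edit the file by hand, the two property lines can be injected with GNU sed's append command. A sketch on a scratch file; for real use, run it against ${HADOOP_HOME}/src/contrib/build-contrib.xml:

```shell
# Scratch copy standing in for src/contrib/build-contrib.xml
F=$(mktemp)
cat > "$F" <<'EOF'
<project name="hadoopbuildcontrib" xmlns:ivy="antlib:org.apache.ivy.ant">
</project>
EOF

# Append the two extra properties right after the <project> line
# (GNU sed one-line 'a' syntax)
sed -i '/<project/a <property name="version" value="1.0.4"/>' "$F"
sed -i '/<project/a <property name="eclipse.home" location="/software/eclipse3.7"/>' "$F"
```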
6. Build
6.1 Build Hadoop
$ cd /software/hadoop
$ ant compile
A few errors may appear at this point. One complains that the autotools (automake, autoconf, libtool) are missing; on CentOS they can be installed with yum install automake autoconf libtool.
Another error is:
/hadoop/mapred/gridmix/Gridmix.java:396: error: type argument ? extends T is not within the bounds of type-variable E
Fix it by editing /src/contrib/gridmix/src/java/org/apache/hadoop/mapred/gridmix/Gridmix.java:
- private <T> String getEnumValues(Enum<? extends T>[] e) {
+ private String getEnumValues(Enum<?>[] e) {
    StringBuilder sb = new StringBuilder();
    String sep = "";
-   for (Enum<? extends T> v : e) {
+   for (Enum<?> v : e) {
      sb.append(sep); sb.append(v.name()); sep = "|";
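The same Gridmix change can be scripted with GNU sed. A sketch on a scratch file mimicking the two affected lines; for real use, run it against src/contrib/gridmix/src/java/org/apache/hadoop/mapred/gridmix/Gridmix.java:

```shell
# Scratch file standing in for Gridmix.java (only the two lines that change)
SRC=$(mktemp)
cat > "$SRC" <<'EOF'
  private <T> String getEnumValues(Enum<? extends T>[] e) {
    for (Enum<? extends T> v : e) {
EOF

# Drop the unused type parameter and relax the wildcard bound
sed -i 's/<T> String getEnumValues(Enum<? extends T>\[\]/String getEnumValues(Enum<?>[]/' "$SRC"
sed -i 's/Enum<? extends T> v/Enum<?> v/' "$SRC"
```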
On Windows, using the rebuilt hadoop-core-1.0.4.jar avoids the permission problem described in step 4.
6.2 Build the eclipse-plugin
$ cd /software/hadoop/src/contrib/eclipse-plugin/
$ ant jar
On success, hadoop-eclipse-plugin-1.0.4.jar is generated under /software/hadoop/build/contrib/eclipse-plugin.
7. Fix up the plugin jar
At this point hadoop-eclipse-plugin-1.0.4.jar is still missing some dependency jars; if you drop it into ${ECLIPSE_HOME}/plugins as-is, connecting to DFS fails with:
An internal error occurred during: "Connecting to DFS Hadoop".org/apache/commons/configuration/Configuration
The fix is to modify hadoop-eclipse-plugin-1.0.4.jar:
7.1 Open the jar with an archive manager; it contains only commons-cli-1.2.jar and hadoop-core.jar. Copy commons-configuration-1.6.jar, commons-httpclient-3.0.1.jar, commons-lang-2.4.jar, jackson-core-asl-1.8.8.jar and jackson-mapper-asl-1.8.8.jar from ${HADOOP_HOME}/lib into the jar's lib directory.
7.2 Edit MANIFEST.MF in the jar's META-INF directory and change the classpath to:
Bundle-ClassPath: classes/,lib/hadoop-core.jar,lib/commons-cli-1.2.jar,lib/commons-httpclient-3.0.1.jar,lib/jackson-core-asl-1.8.8.jar,lib/jackson-mapper-asl-1.8.8.jar,lib/commons-configuration-1.6.jar,lib/commons-lang-2.4.jar
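The Bundle-ClassPath value is easy to get wrong when typed by hand; a small sketch that assembles it from the jar list (the names are the jars copied above):

```shell
# Jars bundled inside the plugin, in the order they appear on the classpath
JARS="hadoop-core.jar commons-cli-1.2.jar commons-httpclient-3.0.1.jar jackson-core-asl-1.8.8.jar jackson-mapper-asl-1.8.8.jar commons-configuration-1.6.jar commons-lang-2.4.jar"

# Build the comma-separated Bundle-ClassPath value, starting with classes/
CP="classes/"
for j in $JARS; do
  CP="$CP,lib/$j"
done
echo "Bundle-ClassPath: $CP"
```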
Copy the fixed hadoop-eclipse-plugin-1.0.4.jar into Eclipse's plugins directory and restart Eclipse. The Hadoop Eclipse build environment on Linux is now complete.
Part 2. Compiling the eclipse-plugin on Windows
Operating system: Windows 7 32-bit
1. Download the packages:
jdk-6u30-windows-i586.exe
apache-ant-1.8.4-bin.zip
eclipse-jee-indigo-SR2-win32.zip
hadoop-1.0.4.tar.gz
2. Set up the environment
2.1 Install the JDK, set JAVA_HOME, and add %JAVA_HOME%\bin to the Path variable.
2.2 Extract apache-ant-1.8.4-bin.zip to E:/software/, set ANT_HOME, and add %ANT_HOME%\bin to Path.
2.3 Extract hadoop-1.0.4.tar.gz to E:/software/.
2.4 Extract eclipse-jee-indigo-SR2-win32.zip to E:/software/ and rename the folder to eclipse3.7.
3. Edit the eclipse-plugin build files and stage the jars
3.1 Edit build-contrib.xml under ${HADOOP_HOME}/src/contrib/ and add the first two property lines below (eclipse.home and version), which are not in the stock file, supplying the Eclipse path and the Hadoop version:
<project name="hadoopbuildcontrib" xmlns:ivy="antlib:org.apache.ivy.ant">
<property name="eclipse.home" location="E:/software/eclipse3.7"/>
<property name="version" value="1.0.4"/>
<property name="name" value="${ant.project.name}"/>
<property name="root" value="${basedir}"/>
<property name="hadoop.root" location="${root}/../../../"/>
...
</project>
3.2 Edit eclipse-plugin/build.xml
(1) Make the eclipse-sdk-jars path look like this:
<path id="eclipse-sdk-jars">
  <fileset dir="${eclipse.home}/plugins/">
    <include name="org.eclipse.ui*.jar"/>
    <include name="org.eclipse.jdt*.jar"/>
    <include name="org.eclipse.core*.jar"/>
    <include name="org.eclipse.equinox*.jar"/>
    <include name="org.eclipse.debug*.jar"/>
    <include name="org.eclipse.osgi*.jar"/>
    <include name="org.eclipse.swt*.jar"/>
    <include name="org.eclipse.jface*.jar"/>
    <include name="org.eclipse.team.cvs.ssh2*.jar"/>
    <include name="com.jcraft.jsch*.jar"/>
  </fileset>
  <fileset dir="../../../">
    <include name="hadoop*.jar"/>
  </fileset>
</path>
(2) Delete deprecation="${javac.deprecation}" from build.xml.
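Removing the deprecation attribute can likewise be scripted. A sketch with GNU sed on a scratch file containing a hypothetical javac line; run it against src/contrib/eclipse-plugin/build.xml for real use:

```shell
# Scratch file with a hypothetical javac task carrying the attribute
F=$(mktemp)
cat > "$F" <<'EOF'
<javac srcdir="${src.dir}" deprecation="${javac.deprecation}" destdir="${build.classes}"/>
EOF

# Strip the deprecation attribute wherever it appears
sed -i 's/ deprecation="\${javac.deprecation}"//' "$F"
```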
3.3 Create the target directories and copy the jars:
>copy %HADOOP_HOME%\lib\commons-cli-1.2.jar %HADOOP_HOME%\build\ivy\lib\Hadoop\common\
>copy %HADOOP_HOME%\hadoop-core-1.0.4.jar %HADOOP_HOME%\build\
4. Build
From a command prompt, change to %HADOOP_HOME%\src\contrib\eclipse-plugin and run:
ant jar
hadoop-eclipse-plugin-1.0.4.jar is generated under %HADOOP_HOME%\build\contrib\eclipse-plugin.
5. Fix up the plugin jar
As in the Linux build, hadoop-eclipse-plugin-1.0.4.jar is still missing some dependency jars; if you drop it into ${ECLIPSE_HOME}/plugins as-is, connecting to DFS fails with:
An internal error occurred during: "Connecting to DFS Hadoop".org/apache/commons/configuration/Configuration
Apply the same fix described in the Linux section above: copy commons-configuration-1.6.jar, commons-httpclient-3.0.1.jar, commons-lang-2.4.jar, jackson-core-asl-1.8.8.jar and jackson-mapper-asl-1.8.8.jar from %HADOOP_HOME%\lib into the jar's lib directory, and update Bundle-ClassPath in META-INF/MANIFEST.MF to list them.
Copy the fixed hadoop-eclipse-plugin-1.0.4.jar into Eclipse's plugins directory and restart Eclipse. The Hadoop Eclipse build environment on Windows is now complete.