Fixing "localhost: Warning: $HADOOP_HOME is deprecated."
2013-04-10 11:00
This warning is printed every time Hadoop is started or stopped.
The fix, from http://thysmichels.com/2012/02/11/tips-running-hadoop-on-ubuntu/:
When you get this Warning: $HADOOP_HOME is deprecated
Solution: add export HADOOP_HOME_WARN_SUPPRESS="TRUE" to hadoop-env.sh.
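One way to apply the fix, sketched below on a scratch file: on a real install, point HADOOP_ENV at your actual hadoop-env.sh (usually under the install's conf/ directory; the exact path is an assumption and varies by setup).

```shell
# Demo on a scratch file; on a real install set HADOOP_ENV to your
# conf/hadoop-env.sh (placeholder path, adjust to your layout).
HADOOP_ENV="$(mktemp)"
# Append the suppression flag only if it is not already present.
grep -q '^export HADOOP_HOME_WARN_SUPPRESS' "$HADOOP_ENV" || \
  echo 'export HADOOP_HOME_WARN_SUPPRESS="TRUE"' >> "$HADOOP_ENV"
```

The guard keeps the line from being appended twice if you re-run the snippet.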
~~~~~~~~~~~~
The same post also covers other common errors and their fixes:
Tips running Hadoop on Ubuntu
Below are some tips for running Hadoop on Ubuntu. If you run into an error that is not covered here, please comment with the problem and how you solved it.

When you get this Warning: $HADOOP_HOME is deprecated
Solution: add export HADOOP_HOME_WARN_SUPPRESS="TRUE" to hadoop-env.sh.
Cannot create directory /usr/local/hadoop/libexec/../logs
Solution: sudo chown -R hduser:hadoop /usr/local/hadoop/
Enter passphrase when running ./start-all.sh
Solution: ssh-keygen -t rsa -P "" creates an SSH key without a passphrase.
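Generating the key is only half the fix: the public key also has to be authorized for logins to this machine, or start-all.sh will still prompt. A sketch of the usual sequence:

```shell
# Create a passwordless RSA key (only if one does not exist yet)
# and authorize it for SSH logins to this machine.
mkdir -p ~/.ssh && chmod 700 ~/.ssh
[ -f ~/.ssh/id_rsa ] || ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
```

Afterwards, `ssh localhost` should log in without asking for a passphrase.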
Warning: <property>/<configuration> not set
Solution: make sure the <property> and <configuration> tags are populated in core-site.xml, mapred-site.xml, hdfs-site.xml
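For reference, a minimal core-site.xml with the tags populated. The values below are illustrative, not defaults: hadoop.tmp.dir is set to the /app/hadoop/tmp path used elsewhere in this post, and 54310 is just a commonly used NameNode port.

```xml
<?xml version="1.0"?>
<!-- Minimal core-site.xml sketch; values are examples, not defaults. -->
<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/app/hadoop/tmp</value>
  </property>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:54310</value>
  </property>
</configuration>
```

mapred-site.xml and hdfs-site.xml need the same <configuration>/<property> structure around their own properties.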
Send or retrieve file to and from HDFS
Solution:
Send file to HDFS > bin/hadoop dfs -put /home/someone/interestingFile.txt /user/yourUserName/
Get file from HDFS > bin/hadoop dfs -get foo localFile
ssh: connect to host localhost port 22: Connection refused
Solution: By default Ubuntu will not have ssh installed so you will have to install and start it.
Install > sudo apt-get install ssh
Start > sudo service ssh start
hadoop Incompatible namespaceIDs in /app/hadoop/tmp/*
Solution:
Stop the cluster: ./stop-dfs.sh
Delete the directory specified on the DataNode: rm -r /app/hadoop/tmp/*
Reformat the NameNode: hadoop namenode -format
OR
Stop the DataNode: ./stop-dfs.sh
Edit the value of namespaceID in /current/VERSION to match the namespaceID of the current NameNode.
Restart the DataNode: ./start-dfs.sh
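The second option as commands, demonstrated on a scratch file. On a real DataNode you would edit the VERSION file under your dfs.data.dir instead, and NEW_ID must be copied from the NameNode's own VERSION file; both values below are placeholders.

```shell
# Demo on a scratch VERSION file; real path and ID are placeholders.
VERSION_FILE="$(mktemp)"
printf 'namespaceID=999\nstorageID=DS-1\n' > "$VERSION_FILE"  # stand-in contents
NEW_ID="123456789"                                            # use the NameNode's ID here
# Rewrite only the namespaceID line, leaving the rest of the file intact.
sed -i "s/^namespaceID=.*/namespaceID=${NEW_ID}/" "$VERSION_FILE"
```

This avoids reformatting the NameNode, so existing HDFS data is preserved.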
hadoop java.net.UnknownHostException: ubuntu: ubuntu
Solution:
1. Add an entry mapping the hostname ubuntu to your machine's IP in your /etc/hosts file: sudo vi /etc/hosts
2. Restart your network: sudo /etc/init.d/networking restart
So your /etc/hosts file on your master machine will look something like this:

172.16.62.152 master
172.16.62.151 slave
172.16.62.152 ubuntu

On your slave machine:

172.16.62.152 master
172.16.62.151 slave
172.16.62.151 ubuntu
If none of that works, change the master/localhost hostname to the IP address in core-site.xml and mapred-site.xml.
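For example, in core-site.xml and mapred-site.xml you can point at the master's IP directly, sidestepping hostname resolution entirely. The IP below matches the hosts example above; the ports 54310/54311 are assumptions, so use whatever your cluster is actually configured with.

```xml
<!-- core-site.xml: address the NameNode by IP instead of hostname -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://172.16.62.152:54310</value>
</property>

<!-- mapred-site.xml: likewise for the JobTracker -->
<property>
  <name>mapred.job.tracker</name>
  <value>172.16.62.152:54311</value>
</property>
```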