Dajiangtai on Common Problems When Setting Up a Hadoop Environment and Their Solutions (2)
2015-08-10 15:48
The problems covered in this article all come from questions asked by Dajiangtai's Hadoop students. Below are the problem descriptions and their solutions; we hope they are useful to anyone currently learning Hadoop.
Problem 1: Hadoop fails to start
Problem description: The preceding steps were all completed, but running start-all.sh to start Hadoop produced this problem. How do I fix it?
Solution: SSH is not configured correctly. Repeat the SSH configuration steps from the environment-setup courseware in full and startup should succeed.
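What start-all.sh actually relies on is passwordless SSH from the current user to localhost. A minimal sketch of that configuration (key type and paths are the common OpenSSH defaults; adjust if your course materials differ):

```shell
# Generate an RSA key pair with an empty passphrase
# (skip this step if ~/.ssh/id_rsa already exists)
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa

# Authorize that key for logins to this same machine
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys

# Verify: this must log in without asking for a password
ssh localhost exit && echo "SSH OK"
```

If the last command still prompts for a password, check the permissions on `~/.ssh` (700) and `authorized_keys` (600), since sshd refuses keys in world-writable locations.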
Problem 2: When setting up the Eclipse development environment, one step is to set the local Hadoop installation/run directory
Problem description: What is the point of setting the local Hadoop installation directory? Don't the programs all connect to the remote machine to run?
Solution: Setting the local Hadoop installation directory makes it convenient to create MapReduce projects and develop programs locally: when you create a Map/Reduce project, the jar files Hadoop needs are pulled in automatically, with nothing extra to add. MapReduce programs can then be test-run locally or submitted to the Hadoop cluster for execution.
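The difference between a local test run and a cluster run comes down to two configuration properties (property names are those used by Hadoop 0.20.x; the host and port values below are placeholders for illustration, not from the original question):

```xml
<!-- core-site.xml: where the file system lives.
     file:/// keeps everything on the local disk;
     hdfs://host:port points at the cluster's NameNode. -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://namenode-host:9000</value> <!-- or file:/// for local runs -->
  </property>
  <!-- mapred-site.xml: where jobs execute.
       "local" runs the job in-process for debugging;
       host:port submits it to the cluster's JobTracker. -->
  <property>
    <name>mapred.job.tracker</name>
    <value>namenode-host:9001</value> <!-- or local for in-process testing -->
  </property>
</configuration>
```

With `file:///` and `local`, a job launched from Eclipse runs entirely inside the IDE's JVM against the local Hadoop directory, which is what makes step-through debugging possible before deploying to the cluster.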
Problem 3: Hadoop reports errors on startup, and I don't know why
Problem description:
chaiying0@ubuntu:~$ /home/hadoop/hadoop-0.20.2/bin/start-all.sh
starting namenode, logging to /home/hadoop/hadoop-0.20.2/bin/../logs/hadoop-chaiying0-namenode-ubuntu.out
/home/hadoop/hadoop-0.20.2/bin/hadoop-daemon.sh: line 117: /home/hadoop/hadoop-0.20.2/bin/../logs/hadoop-chaiying0-namenode-ubuntu.out: Permission denied
head: cannot open `/home/hadoop/hadoop-0.20.2/bin/../logs/hadoop-chaiying0-namenode-ubuntu.out' for reading: No such file or directory
localhost: starting datanode, logging to /home/hadoop/hadoop-0.20.2/bin/../logs/hadoop-chaiying0-datanode-ubuntu.out
localhost: /home/hadoop/hadoop-0.20.2/bin/hadoop-daemon.sh: line 117: /home/hadoop/hadoop-0.20.2/bin/../logs/hadoop-chaiying0-datanode-ubuntu.out: Permission denied
localhost: head: cannot open `/home/hadoop/hadoop-0.20.2/bin/../logs/hadoop-chaiying0-datanode-ubuntu.out' for reading: No such file or directory
localhost: starting secondarynamenode, logging to /home/hadoop/hadoop-0.20.2/bin/../logs/hadoop-chaiying0-secondarynamenode-ubuntu.out
localhost: /home/hadoop/hadoop-0.20.2/bin/hadoop-daemon.sh: line 117: /home/hadoop/hadoop-0.20.2/bin/../logs/hadoop-chaiying0-secondarynamenode-ubuntu.out: Permission denied
localhost: head: cannot open `/home/hadoop/hadoop-0.20.2/bin/../logs/hadoop-chaiying0-secondarynamenode-ubuntu.out' for reading: No such file or directory
starting jobtracker, logging to /home/hadoop/hadoop-0.20.2/bin/../logs/hadoop-chaiying0-jobtracker-ubuntu.out
/home/hadoop/hadoop-0.20.2/bin/hadoop-daemon.sh: line 117: /home/hadoop/hadoop-0.20.2/bin/../logs/hadoop-chaiying0-jobtracker-ubuntu.out: Permission denied
head: cannot open `/home/hadoop/hadoop-0.20.2/bin/../logs/hadoop-chaiying0-jobtracker-ubuntu.out' for reading: No such file or directory
localhost: starting tasktracker, logging to /home/hadoop/hadoop-0.20.2/bin/../logs/hadoop-chaiying0-tasktracker-ubuntu.out
localhost: /home/hadoop/hadoop-0.20.2/bin/hadoop-daemon.sh: line 117: /home/hadoop/hadoop-0.20.2/bin/../logs/hadoop-chaiying0-tasktracker-ubuntu.out: Permission denied
localhost: head: cannot open `/home/hadoop/hadoop-0.20.2/bin/../logs/hadoop-chaiying0-tasktracker-ubuntu.out' for reading: No such file or directory
There seem to be two errors here: one is Permission denied, the other is No such file or directory. How do I fix this?
Solution: Your current user does not have permission to write into the logs directory under the Hadoop installation. Run ls -al and check whether logs is owned by the current user. If it is not, run sudo chown -R chaiying0:chaiying0 logs to hand ownership to the user chaiying0. The second error is a consequence of the first: because each daemon's .out log file could not be created, the subsequent head command finds no file to open.
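The fix can be sketched as the following sequence (paths and the user name are taken from the log output and shell prompt in the question):

```shell
cd /home/hadoop/hadoop-0.20.2

# Inspect ownership of the logs directory; if it shows a different
# user (e.g. hadoop), that explains the Permission denied errors
ls -ld logs

# Recursively hand the directory to the user who runs start-all.sh
sudo chown -R chaiying0:chaiying0 logs

# The daemons can now create their .out files under logs/
bin/start-all.sh
```

Note that chown needs -R here: the daemons write files inside logs, so any pre-existing files under it must change owner too, not just the directory itself.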