
Passwordless login for a Hadoop cluster: fixing the password prompt when starting and stopping the namenode

2017-09-12 20:07
I have recently been learning Hadoop by following tutorials found online, and I could not figure out why I was asked for a password every time the namenode was started or stopped. Searching the web got me nowhere; only after studying ssh more carefully did I realize the cause: I had never added the namenode host's own public key to its local authorized_keys file, so even ssh 127.0.0.1 still asked for a password. Many online guides never point this step out, so I am leaving this note here as a record.
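A quick way to confirm that this is the problem (a minimal sketch; 127.0.0.1 and master.hadoop.com are simply the hosts from my setup). BatchMode=yes makes ssh fail immediately instead of prompting, so if either command below errors out, the local key is not yet trusted:

$ ssh -o BatchMode=yes 127.0.0.1 true && echo "local key login OK"
$ ssh -o BatchMode=yes master.hadoop.com true && echo "master key login OK"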

[zzjlzx@master .ssh]$ start-all.sh

starting namenode, logging to /home/zzjlzx/hadoop/libexec/../logs/hadoop-zzjlzx-namenode-master.hadoop.com.out

slaves01.hadoop.com: 

slaves01.hadoop.com: starting datanode, logging to /home/zzjlzx/hadoop/logs/hadoop-zzjlzx-datanode-slaves01.hadoop.com.out

slaves02.hadoop.com: ssh: connect to host slaves02.hadoop.com port 22: No route to host

slaves03.hadoop.com: ssh: connect to host slaves03.hadoop.com port 22: No route to host

zzjlzx@master.hadoop.com's password: 

master.hadoop.com: starting secondarynamenode, logging to /home/zzjlzx/hadoop/libexec/../logs/hadoop-zzjlzx-secondarynamenode-master.hadoop.com.out

1. ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa

2. cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys

These two steps are enough for passwordless ssh localhost. After configuring, it is best to ssh once so that the host gets recorded in ~/.ssh/known_hosts.

For a pseudo-distributed setup, the ssh configuration is now complete.
On some systems you also need to change the permissions of ~/.ssh/authorized_keys to 600 (this also took me a while to figure out; ssh is strict about the permissions of this file).
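Putting steps 1, 2 and the permission fix together, the whole local setup looks roughly like this (a sketch run as the same user on the namenode; the dsa key type matches step 1, rsa works the same way):

# generate a key pair with an empty passphrase
$ ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa
# trust our own public key for local logins
$ cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys
# sshd ignores authorized_keys if the permissions are too loose
$ chmod 700 ~/.ssh
$ chmod 600 ~/.ssh/authorized_keys
# the first login also records the host in ~/.ssh/known_hosts
$ ssh localhost exit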

A fully distributed cluster additionally needs the following:

3. Copy the namenode's key to the other nodes; ideally set it up in both directions so that every node can ssh to every other node.

In the usual case, the configuration is now complete.

Sometimes the nodes use different usernames. When the namenode starts the other nodes, it logs in with its own username by default.

For example: the namenode's username is hadoop, while node1's username is user1.

The namenode then logs in as ssh hadoop@host, and startup will still prompt for a password.

Solution: in the slaves file, write the entry as user1@node1 (see the sketch below).

It is still best to use the same username on every node; Hadoop also has its own security authentication, and a single username is easier to manage.
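For the mixed-username case above, the slaves file would contain something like the following (node1/node2/node3 are made-up names for illustration; in this Hadoop 1.x layout the file lives at conf/slaves under the Hadoop install directory):

# conf/slaves -- one worker host per line; prefix user@ only where the
# remote username differs from the one running start-all.sh on the namenode
user1@node1
node2
node3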

Example: configuring passwordless login for a Hadoop cluster:

1. Generate the private key id_rsa and the public key id_rsa.pub
[hadoop@hadoop1 ~]$ ssh-keygen -t rsa 
Generating public/private rsa key pair. 
Enter file in which to save the key (/home/hadoop/.ssh/id_rsa): 
Enter passphrase (empty for no passphrase): 
Enter same passphrase again: 
Your identification has been saved in /home/hadoop/.ssh/id_rsa. 
Your public key has been saved in /home/hadoop/.ssh/id_rsa.pub. 
The key fingerprint is: 
d6:63:76:43:e2:5b:8e:85:ab:67:a2:7c:a6:8f:23:f9 hadoop@hadoop1.test.com 
 
2. Check that the private key id_rsa and public key id_rsa.pub files are in place
[hadoop@hadoop1 ~]$ ls .ssh/ 
authorized_keys  id_rsa  id_rsa.pub  known_hosts 
3. Add the public key to the authorized_keys file on the namenode server
$cd
$cd .ssh 
$cat id_rsa.pub >> authorized_keys
$ll
-rw-rw-r--. 1 zzjlzx zzjlzx  406 11月 29 01:01 authorized_keys

-rw-------. 1 zzjlzx zzjlzx 1675 11月 26 01:30 id_rsa

-rw-r--r--. 1 zzjlzx zzjlzx  406 11月 26 01:30 id_rsa.pub

-rw-r--r--. 1 zzjlzx zzjlzx 2440 11月 29 01:06 known_hosts

$chmod 600 authorized_keys
$ll

-rw-------. 1 zzjlzx zzjlzx  406 11月 29 01:01 authorized_keys

-rw-------. 1 zzjlzx zzjlzx 1675 11月 26 01:30 id_rsa

-rw-r--r--. 1 zzjlzx zzjlzx  406 11月 26 01:30 id_rsa.pub

-rw-r--r--. 1 zzjlzx zzjlzx 2440 11月 29 01:06 known_hosts

4. Copy the public key file to the datanode servers
[hadoop@hadoop1 ~]$ ssh-copy-id -i ~/.ssh/id_rsa.pub hadoop@datanode1 
hadoop@datanode1's password: 
Now try logging into the machine, with "ssh 'hadoop@datanode1'", and check in: 
 
  .ssh/authorized_keys 
 
to make sure we haven't added extra keys that you weren't expecting. 
 
[hadoop@hadoop1 ~]$ ssh-copy-id -i ~/.ssh/id_rsa.pub hadoop@datanode2 
hadoop@datanode2's password: 
Now try logging into the machine, with "ssh 'hadoop@datanode2'", and check in: 
 
  .ssh/authorized_keys 
 
to make sure we haven't added extra keys that you weren't expecting. 
 
[hadoop@hadoop1 ~]$ ssh-copy-id -i ~/.ssh/id_rsa.pub hadoop@datanode3 
hadoop@datanode3's password: 
Now try logging into the machine, with "ssh 'hadoop@datanode3'", and check in: 
 
  .ssh/authorized_keys 
 
to make sure we haven't added extra keys that you weren't expecting. 
 
[hadoop@hadoop1 ~]$ ssh-copy-id -i ~/.ssh/id_rsa.pub hadoop@localhost 
hadoop@localhost's password: 
Now try logging into the machine, with "ssh 'hadoop@localhost'", and check in: 
 
  .ssh/authorized_keys 
 
to make sure we haven't added extra keys that you weren't expecting. 
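If ssh-copy-id is not available on your system, the same result can be achieved by hand; a sketch assuming the same hadoop user and datanode hostnames as above:

# append our public key to each datanode's authorized_keys
# (each host asks for the password one last time)
for h in datanode1 datanode2 datanode3; do
  cat ~/.ssh/id_rsa.pub | ssh hadoop@$h \
    'mkdir -p ~/.ssh && cat >> ~/.ssh/authorized_keys && chmod 700 ~/.ssh && chmod 600 ~/.ssh/authorized_keys'
done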
 
 
5. Verify
[hadoop@hadoop1 ~]$ ssh datanode1 
Last login: Thu Feb  2 09:01:16 2012 from 192.168.57.71 
[hadoop@hadoop2 ~]$ exit 
logout 
 
[hadoop@hadoop1 ~]$ ssh datanode2 
Last login: Thu Feb  2 09:01:18 2012 from 192.168.57.71 
[hadoop@hadoop3 ~]$ exit 
logout 
 
[hadoop@hadoop1 ~]$ ssh datanode3 
Last login: Thu Feb  2 09:01:20 2012 from 192.168.57.71 
[hadoop@hadoop4 ~]$ exit 
logout 
 
[hadoop@hadoop1 ~]$ ssh localhost 
Last login: Thu Feb  2 09:01:24 2012 from 192.168.57.71 
[hadoop@hadoop1 ~]$ exit 
logout 
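The interactive checks above can also be run in one pass; a small sketch using the same hostnames, where BatchMode=yes turns any remaining password prompt into an immediate error:

for h in localhost datanode1 datanode2 datanode3; do
  ssh -o BatchMode=yes hadoop@$h hostname || echo "$h still requires a password"
done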
 

References: http://dngood.blog.51cto.com/446195/775368 http://blog.csdn.net/tyj2788540/article/details/8191666