
Setting a Connection Username and Password for Hive

2017-05-22 17:08
To set a username and password for Hive connections, follow these steps.

In hive-site.xml, hive.server2.authentication defaults to NONE; change it to CUSTOM:

<property>
  <name>hive.server2.authentication</name>
  <value>CUSTOM</value>
  <description>
    Expects one of [nosasl, none, ldap, kerberos, pam, custom].
    Client authentication types.
      NONE: no authentication check
      LDAP: LDAP/AD based authentication
      KERBEROS: Kerberos/GSSAPI authentication
      CUSTOM: Custom authentication provider
              (Use with property hive.server2.custom.authentication.class)
      PAM: Pluggable authentication module
      NOSASL: Raw transport
  </description>
</property>


Configure the custom authentication class in hive-site.xml:
<property>
  <name>hive.server2.custom.authentication.class</name>
  <value>org.apache.hadoop.hive.contrib.auth.CustomPasswdAuthenticator</value>
</property>
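
For reference, the class configured here must implement Hive's PasswdAuthenticationProvider interface, which in Hive 2.x is essentially the following (shown only for context, not part of the original setup steps):

package org.apache.hive.service.auth;

import javax.security.sasl.AuthenticationException;

public interface PasswdAuthenticationProvider {
  // HiveServer2 calls this for every connection attempt: return normally to
  // accept the user, or throw an AuthenticationException to reject the login.
  void Authenticate(String user, String password) throws AuthenticationException;
}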


The custom authentication class:
package org.apache.hadoop.hive.contrib.auth;

import javax.security.sasl.AuthenticationException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hive.conf.HiveConf;
import org.slf4j.Logger;

public class CustomPasswdAuthenticator implements org.apache.hive.service.auth.PasswdAuthenticationProvider {

  private Logger LOG = org.slf4j.LoggerFactory.getLogger(CustomPasswdAuthenticator.class);

  // Each user's password is stored in hive-site.xml as hive.jdbc_passwd.auth.<username>
  private static final String HIVE_JDBC_PASSWD_AUTH_PREFIX = "hive.jdbc_passwd.auth.%s";

  private Configuration conf = null;

  @Override
  public void Authenticate(String userName, String passwd)
      throws AuthenticationException {
    LOG.info("user: " + userName + " try login.");
    String passwdConf = getConf().get(String.format(HIVE_JDBC_PASSWD_AUTH_PREFIX, userName));
    if (passwdConf == null) {
      String message = "user's ACL configuration is not found. user:" + userName;
      LOG.info(message);
      throw new AuthenticationException(message);
    }
    if (!passwd.equals(passwdConf)) {
      String message = "user name and password do not match. user:" + userName;
      throw new AuthenticationException(message);
    }
  }

  public Configuration getConf() {
    if (conf == null) {
      // HiveConf loads hive-site.xml, so the hive.jdbc_passwd.auth.* properties are visible here
      this.conf = new Configuration(new HiveConf());
    }
    return conf;
  }

  public void setConf(Configuration conf) {
    this.conf = conf;
  }

}
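
Before deploying, the class can be sanity-checked locally. The sketch below is my own addition (the class name AuthenticatorSmokeTest is made up); it assumes the Hive jars and a hive-site.xml containing the hive.jdbc_passwd.auth.zhangsan property from the step further below are on the classpath:

package org.apache.hadoop.hive.contrib.auth;

// Hypothetical quick check, not part of the original post.
public class AuthenticatorSmokeTest {
  public static void main(String[] args) throws Exception {
    CustomPasswdAuthenticator auth = new CustomPasswdAuthenticator();
    // Returns silently for a correct username/password, throws AuthenticationException otherwise
    auth.Authenticate("zhangsan", "123456789");
    System.out.println("authentication ok");
  }
}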


Package this custom class into a jar and place it under hive/lib (restart HiveServer2 so the new jar and configuration take effect); otherwise HiveServer2 fails with a "class not found" error. To demonstrate, I deliberately misspelled the configured class name here:
2017-05-22T09:42:20,241 ERROR [HiveServer2-Handler-Pool: Thread-41] server.TThreadPoolServer: Error occurred during processing of message.
java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.hive.contrib.auth.CustomPasswdAuthenticatve not found
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2195)
at org.apache.hive.service.auth.CustomAuthenticationProviderImpl.<init>(CustomAuthenticationProviderImpl.java:40)
at org.apache.hive.service.auth.AuthenticationProviderFactory.getAuthenticationProvider(AuthenticationProviderFactory.java:70)
at org.apache.hive.service.auth.AuthenticationProviderFactory.getAuthenticationProvider(AuthenticationProviderFactory.java:61)
at org.apache.hive.service.auth.PlainSaslHelper$PlainServerCallbackHandler.handle(PlainSaslHelper.java:106)
at org.apache.hive.service.auth.PlainSaslServer.evaluateResponse(PlainSaslServer.java:103)
at org.apache.thrift.transport.TSaslTransport$SaslParticipant.evaluateChallengeOrResponse(TSaslTransport.java:539)
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:283)
at org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)
at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:269)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: Class org.apache.hadoop.hive.contrib.auth.CustomPasswdAuthenticatve not found
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2101)
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2193)
... 13 more


Configure the user's password in hive-site.xml. The username here is zhangsan and the password is 123456789; if there are multiple users, add one such property per user:
<property>
  <name>hive.jdbc_passwd.auth.zhangsan</name>
  <value>123456789</value>
  <description/>
</property>


Java connection code:
package com.hive;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class HiveManage {

  private static final String URLHIVE = "jdbc:hive2://192.168.184.130:10000/default";
  private static Connection connection = null;

  public static Connection getHiveConnection() {
    if (null == connection) {
      synchronized (HiveManage.class) {
        if (null == connection) {
          try {
            // Register the HiveServer2 JDBC driver and connect with the configured username/password
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            connection = DriverManager.getConnection(URLHIVE, "zhangsan", "123456789");
          } catch (SQLException e) {
            e.printStackTrace();
          } catch (ClassNotFoundException e) {
            e.printStackTrace();
          }
        }
      }
    }
    return connection;
  }

  public static void main(String[] args) throws SQLException {

    // String sql = "select ipaddress,count(ipaddress) as count from apachelog "
    //     + "group by ipaddress order by count desc";
    String sql1 = "select ipaddress, t_user, request, agent from apachelog limit 5";
    PreparedStatement pstm = getHiveConnection().prepareStatement(sql1);
    // The statement is already prepared, so call executeQuery() with no arguments
    ResultSet rs = pstm.executeQuery();

    while (rs.next()) {
      System.out.println(rs.getString(1) + "\t" + rs.getString(2)
          + "\t" + rs.getString(3) + "\t" + rs.getString(4));
    }
    rs.close();
    pstm.close();

  }

}
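
As a side note, the same query can also be written with try-with-resources so the statement and result set are closed even if the query fails. This is my own sketch, reusing the HiveManage class above and the same apachelog table:

package com.hive;

import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class HiveQueryExample {
  public static void main(String[] args) throws SQLException {
    String sql = "select ipaddress, t_user, request, agent from apachelog limit 5";
    // try-with-resources closes the statement and result set automatically
    try (PreparedStatement pstm = HiveManage.getHiveConnection().prepareStatement(sql);
         ResultSet rs = pstm.executeQuery()) {
      while (rs.next()) {
        System.out.println(rs.getString(1) + "\t" + rs.getString(2)
            + "\t" + rs.getString(3) + "\t" + rs.getString(4));
      }
    }
  }
}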


If the following error is reported, the connecting user does not have permission on the HDFS directory:
java.lang.RuntimeException: java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: Permission denied: user=zhangsan, access=EXECUTE, inode="/tmp/hive":root:supergroup:drwx------
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:259)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:205)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1728)
at org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getFileInfo(FSDirStatAndListingOp.java:108)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3857)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1012)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:843)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)


Go to the hadoop/bin directory and run:
[root@master bin]#  ./hdfs dfs -chmod -R 777 /tmp


The Eclipse console then displays the query results (screenshot not reproduced here).

If the username or password is wrong, the Hive server log shows exactly the message printed by our custom class (screenshot not reproduced here).

Connecting through Beeline:
beeline> !connect jdbc:hive2://localhost:10000/default
Connecting to jdbc:hive2://localhost:10000/default
Enter username for jdbc:hive2://localhost:10000/default: zhangsan
Enter password for jdbc:hive2://localhost:10000/default: *********
Connected to: Apache Hive (version 2.1.1)
Driver: Hive JDBC (version 2.1.1)
17/05/22 10:50:45 [main]: WARN jdbc.HiveConnection: Request to set autoCommit to false; Hive does not support autoCommit=false.
Transaction isolation: TRANSACTION_REPEATABLE_READ