
CentOS 6.5 64-bit: Installing Hadoop 2.7.0, MapReduce Log Analysis, Hive 2.1.0, Querying Hive over JDBC (4)

2016-09-13 10:50
Part 4: Querying Hive over JDBC

To query Hive over JDBC, a few things need to be in place, as mentioned in the earlier parts:

1. Add the proxy-user settings to Hadoop's core-site.xml

<property>
  <name>hadoop.proxyuser.root.groups</name>
  <value>*</value>
</property>

<property>
  <name>hadoop.proxyuser.root.hosts</name>
  <value>*</value>
</property>
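After editing core-site.xml, the proxy-user settings must be picked up by the cluster. A full restart of HDFS and YARN works; as a lighter-weight alternative, the settings can usually be reloaded with the standard refresh commands (sketch, assuming the commands are run as the HDFS/YARN admin user on the cluster):

```shell
# Reload superuser/proxy-user settings without a full restart
hdfs dfsadmin -refreshSuperUserGroupsConfiguration
yarn rmadmin -refreshSuperUserGroupsConfiguration
```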

2. Enable custom (pluggable) password authentication in HiveServer2; see:
http://blog.csdn.net/system1024/article/details/51955936
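For reference, enabling custom authentication boils down to two hive-site.xml properties: switching the authentication mode to CUSTOM and pointing HiveServer2 at a class implementing org.apache.hive.service.auth.PasswdAuthenticationProvider. The class name below is a placeholder for whatever provider you wrote (the linked post walks through one):

```
<property>
  <name>hive.server2.authentication</name>
  <value>CUSTOM</value>
</property>

<property>
  <name>hive.server2.custom.authentication.class</name>
  <value>your.package.CustomPasswdAuthenticator</value>
</property>
```

The provider's jar must be on HiveServer2's classpath (e.g. dropped into $HIVE_HOME/lib) before restarting the service.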

3. Write a test program

package hive.server2.query;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

public class ApiQueryTest {
    private static final String DRIVER_NAME = "org.apache.hive.jdbc.HiveDriver";
    private static final Logger logger = LogManager.getLogger(ApiQueryTest.class);

    public static void main(String[] args) {
        String url = "jdbc:hive2://10.68.128.215:10000";
        // Top 10 url/status combinations by total request count
        String sql = "select sum(num) total, url, status from apis.api_logs "
                + "group by status, url order by total desc limit 10";

        try {
            Class.forName(DRIVER_NAME);
        } catch (ClassNotFoundException e) {
            logger.error("Hive JDBC driver not found on classpath", e);
            return;
        }

        // try-with-resources closes the connection, statement and result set
        try (Connection con = DriverManager.getConnection(url, "root", "kangyun9413");
             Statement stmt = con.createStatement()) {
            System.out.println("Running: " + sql);
            try (ResultSet res = stmt.executeQuery(sql)) {
                while (res.next()) {
                    System.out.println(res.getString(1) + "\t"
                            + res.getString(2) + "\t" + res.getString(3));
                }
            }
        } catch (Exception e) {
            logger.error("query failed", e);
        }
    }
}


Run the program; it prints the top 10 url/status combinations by total request count.
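To compile and run this, the Hive JDBC driver and a Log4j 2 implementation must be on the classpath. With Maven, the dependencies look roughly like the following (versions chosen to match the Hive 2.1.0 install from the earlier parts; adjust as needed):

```
<dependency>
  <groupId>org.apache.hive</groupId>
  <artifactId>hive-jdbc</artifactId>
  <version>2.1.0</version>
</dependency>
<dependency>
  <groupId>org.apache.logging.log4j</groupId>
  <artifactId>log4j-core</artifactId>
  <version>2.6.2</version>
</dependency>
```

The hive-jdbc artifact pulls in its own transitive dependencies (Thrift, HiveServer2 client classes), so no other Hive jars are needed on the client side.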
